Online Safety Bill

Volume 827: debated on Wednesday 1 February 2023

Second Reading

Moved by

My Lords, I am very glad to be here to move the Second Reading of the Online Safety Bill. I know that this is a moment which has been long awaited in your Lordships’ House and noble Lords from across the House share the Government’s determination to make the online realm safer.

That is what this Bill seeks to do. As it stands, over three quarters of adults in this country express a concern about going online; similarly, the number of parents who feel the benefits outweigh the risks of their children being online has decreased rather than increased in recent years, falling from two-thirds in 2015 to barely over half in 2019. This is a terrible indictment of a means through which people of all ages are living increasing proportions of their lives, and it must change.

All of us have heard the horrific stories of children who have been exposed to dangerous and deeply harmful content online, and the tragic consequences of such experiences both for them and their families. I am very grateful to the noble Baroness, Lady Kidron, who arranged for a number of noble Lords, including me, to see some of the material which was pushed relentlessly at Molly Russell whose family have campaigned bravely and tirelessly to ensure that what happened to their daughter cannot happen to other young people. It is with that in mind, at the very outset of our scrutiny of this Bill, that I would like to express my gratitude to all those families who continue to fight for change and a safer, healthier online realm. Their work has been central to the development of this Bill. I am confident that, through it, the Government’s manifesto commitment to make the UK the safest place in the world to be online will be delivered.

This legislation establishes a regulatory regime which has safety at its heart. It is intended to change the mindset of technology companies so that they are forced to consider safety and risk mitigation when they begin to design their products, rather than as an afterthought.

All companies in scope will be required to tackle criminal content and activity online. If it is illegal offline, it is illegal online. All in-scope platforms and search services will need to consider in risk assessments the likelihood of illegal content or activity taking place on their site and put in place proportionate systems and processes to mitigate those risks. Companies will also have to take proactive measures against priority offences. This means platforms will be required to take proportionate steps to prevent people from encountering such content.

Not only that, but platforms will also need to mitigate the risk of the platform being used to facilitate or commit such an offence. Priority offences include, inter alia: terrorist material, child sexual abuse and exploitation, so-called revenge pornography and material encouraging or assisting suicide. In practice, this means that all in-scope platforms will have to remove this material quickly and will not be allowed to promote it in their algorithms.

Furthermore, for non-priority illegal content, platforms must have effective systems in place for its swift removal once this content has been flagged to them. Gone will be the days of lengthy and arduous complaints processes and platforms feigning ignorance of such content. They can and will be held to account.

As I have previously mentioned, the safety of children is of paramount importance in this Bill. While all users will be protected from illegal material, some types of legal content and activity are not suitable for children and can have a deeply damaging impact on their mental health and their developing sense of the world around them.

All in-scope services which are likely to be accessed by children will therefore be required to assess the risks to children on their service and put in place safety measures to protect child users from harmful and age-inappropriate content. This includes content such as that promoting suicide, self-harm or eating disorders which does not meet a criminal threshold; pornography; and damaging behaviour such as bullying.

The Bill will require providers specifically to consider a number of risk factors as part of their risk assessments. These factors include how functionalities such as algorithms could affect children’s exposure to content harmful to children on their service, as well as children’s use of higher risk features on the service such as livestreaming or private messaging. Providers will need to take robust steps to mitigate and effectively manage any risks identified.

Companies will need to use measures such as age verification to prevent children from accessing content which poses the highest risk of harm to them, such as online pornography. Ofcom will be able to set out its expectations about the use of age assurance solutions, including age verification tools, through guidance. This guidance will also be able to refer to relevant standards. The Bill also now makes it clear that providers may need to use age assurance to identify the age of their users to meet the necessary child safety duties and effectively enforce age restrictions on their service.

The Government will set out in secondary legislation the priority categories of content harmful to children so that all companies are clear on what they need to protect children from. Our intention is to have the regime in place as soon as possible after Royal Assent, while ensuring the necessary preparations are completed effectively and service providers understand clearly what is expected. We are working closely with Ofcom and I will keep noble Lords apprised.

My ministerial colleagues in another place worked hard to strengthen these provisions and made commitments to introduce further provisions in your Lordships’ House. With regard to increased protections for children specifically, the Government will bring forward amendments at Committee stage to name the Children’s Commissioner for England as a statutory consultee for Ofcom when it is preparing a code of practice, ensuring that the experience of children and young people is accounted for during implementation.

We will also bring forward amendments to specify that category 1 companies—the largest and most risky platforms—will be required to publish a summary of their risk assessments for both illegal content and material that is harmful to children. This will increase transparency about illegal and harmful content on in-scope services and ensure that Ofcom can do its job regulating effectively.

We recognise the great suffering experienced by many families linked to children’s exposure to harmful content and the importance of this Bill in ending that. We must learn from the horrific events from the past to secure a safe future for children online.

We also understand that, unfortunately, people of any age may experience online abuse. For many adults, the internet is a positive source of entertainment and information and a way to connect with others; for some, however, it can be an arena for awful abuse. The Bill will therefore offer adult users a triple shield of protection when online, striking the right balance between protecting the right of adult users to access legal content freely, and empowering adults with the information and tools to manage their own online experience.

First, as I have outlined, all social media firms and search services will need to tackle illegal content and activity on their sites. Secondly, the Bill will require category 1 services to set clear terms of service regarding the user-generated content they prohibit and/or restrict access to, and to enforce those terms of service effectively. All the major social media platforms such as Meta, Twitter and TikTok say that they ban abuse and harassment online. They all say they ban the promotion of violence and violent threats, yet this content is still easily visible on those sites. People sign up to these platforms expecting one environment, and are presented with something completely different. This must stop.

As well as ensuring the platforms have proper systems to remove banned content, the Bill will also put an end to services arbitrarily removing legal content. The largest platforms—category 1 services—must ensure that they remove or restrict access to content or ban or suspend users only where that is expressly allowed in their terms of service, or where they otherwise have a legal obligation to do so.

This Bill will make sure that adults have the information they need to make informed decisions about the sites they visit, and that platforms are held to their promises to users. Ofcom will have the power to hold platforms to their terms of service, creating a safer and more transparent environment for all.

Thirdly, category 1 services will have a duty to provide adults with tools they can use to reduce the likelihood that they encounter certain categories of content, if they so choose, or to alert them to the nature of that content. This includes content which encourages, promotes, or provides instructions for suicide, self-harm or eating disorders. People will also have the ability to filter out content from unverified users if they so wish. This Bill will mean that adult users will be empowered to make more informed choices about what services they use, and to have greater control over whom and what they engage with online.

It is impossible to speak about the aspects of the Bill which protect adults without, of course, mentioning freedom of expression. The Bill needs to strike a careful balance between protecting users online, while maintaining adults’ ability to have robust—even uncomfortable or unpleasant—conversations within the law if they so choose. Freedom of expression within the law is fundamental to our democracy, and it would not be right for the Government to interfere with what legal speech is permitted on private platforms. Instead, we have developed an approach based on choice and transparency for adult users, bounded by major platforms’ clear commercial incentives to provide a positive experience for their users.

Of course, we cannot have robust debate without being accurately informed of the current global and national landscape. That is why the Bill includes particular protections for recognised news publishers, content of democratic importance, and journalistic content. We have been clear that sanctioned news outlets such as RT, formerly Russia Today, must not benefit from these protections. We will therefore bring forward an amendment in your Lordships’ House explicitly to exclude entities subject to sanctions from the definition of a recognised news publisher.

Alongside the safety duties for children and the empowerment tools for adults, platforms must also have effective reporting and redress mechanisms in place. They will need to provide accessible and effective mechanisms for users to report content which is illegal or harmful, or where it breaches terms and conditions. Users will need to be given access to effective mechanisms to complain if content is removed without good reason.

The Bill will place a duty on platforms to ensure that those reporting mechanisms are backed up by timely and appropriate redress mechanisms. Currently, internet users often do not bother to report harmful content they encounter online, because they do not feel that their reports will be followed up. That too must change. If content has been unfairly removed, it should be reinstated. If content should not have been on the site in question, it should be taken down. If a complaint is not upheld, the reasons should be made clear to the person who made the report.

There have been calls—including from the noble Lord, Lord Stevenson of Balmacara, with whom I look forward to working constructively, as we have done heretofore—to use the Bill to create an online safety ombudsman. We will listen to all suggestions put forward to improve the Bill and the regime it ushers in with an open mind, but as he knows from our discussions, of this suggestion we are presently unconvinced. Ombudsman services in other sectors are expensive, often underused and primarily relate to complaints which result in financial compensation. We find it difficult to envisage how an ombudsman service could function in this area, where user complaints are likely to be complex and, in many cases, do not have the impetus of financial compensation behind them. Instead, the Bill ensures that, where providers’ user-reporting and redress mechanisms are not sufficient, Ofcom will have the power to take enforcement action and require the provider to improve its user-redress provisions to meet the standard required of them. I look forward to probing elements of the Bill such as this in Committee.

This regulatory framework could not be effective if Ofcom, as the independent regulator, did not have a robust suite of powers to take enforcement action against companies which do not comply with their new duties or which fail to take the appropriate steps to protect people from harm. I believe the chairman of Ofcom, the noble Lord, Lord Grade of Yarmouth, is in his place. I am glad that he has been and will be following our debates on this important matter.

Through the Bill, Ofcom will have wide-ranging information-gathering powers to request any information from companies which is relevant to its safety functions. Where necessary, it will be able to ask a suitably skilled person to undertake a report on a company’s activity—for example, on its use of algorithms. If Ofcom decides to take enforcement action, it can require companies to take specific steps to come back into compliance.

Ofcom will also have the power to impose substantial fines of up to £18 million, or 10% of annual qualifying worldwide revenue, whichever is higher. For the biggest technology companies, this could easily amount to billions of pounds. These are significant measures, and we have heard directly from companies that are already changing their safety procedures to ensure they comply with these regulations.

If fines are not sufficient, or not deemed appropriate because of the severity of the breach, Ofcom will be able to apply for a court order allowing it to undertake business disruption measures. This could be blocking access to a website or preventing it from making money via payment or advertising services. Of course, Ofcom will be able to take enforcement action against any company that provides services to people in the UK, wherever that company is located. This is important, given the global nature of the internet.

As the Bill stands, individual senior managers can be held criminally liable and face a fine for failing to ensure their platform complies with Ofcom’s information notice. Further, individual senior managers can face jail, a fine or both for failing to prevent the platform committing the offences of providing false information, encrypting information or destroying information in response to an information notice.

The Government have also listened to and acknowledged the need for senior managers to be made personally liable for a wider range of failures of compliance. We have therefore committed to tabling an amendment in your Lordships’ House which will be carefully designed to capture instances where senior managers have consented to or connived in ignoring enforceable requirements, risking serious harm to children. We are carefully designing this amendment to ensure that it can hold senior managers to account for their actions regarding the safety of children, without jeopardising the UK’s attractiveness as a place for technology companies to invest in and grow. We intend to base our offence on similar legislation recently passed in the Republic of Ireland, as well as looking carefully at relevant precedent in other sectors in the United Kingdom.

I have discussed the safety of children, adults, and everyone’s right to free speech. It is not possible to talk about this Bill without also discussing its protections for women and girls, who we know are disproportionately affected by online abuse. As I mentioned, all services in scope will need to seek out and remove priority illegal content proactively. There are a number of offences which disproportionately affect women and girls, such as revenge pornography and cyberstalking, which the Bill requires companies to tackle as a priority.

To strengthen protections for women in particular, we will be listing controlling or coercive behaviour as a priority offence. Companies will have to take proactive measures to tackle this type of illegal content. We will also bring forward an amendment to name the Victims’ Commissioner and the domestic abuse commissioner as statutory consultees for the codes of practice. This means there will be a requirement for Ofcom to consult both commissioners ahead of drafting and amending the codes of practice, ensuring that victims, particularly victims and survivors of domestic abuse, are better protected. The Secretary of State and our colleagues have been clear that women’s and girls’ voices must be heard clearly in developing this legislation.

I also want to take this opportunity to acknowledge the concerns voiced over the powers for the Secretary of State regarding direction in relation to codes of practice that currently appear in the Bill. That is a matter on which my honourable friend Paul Scully and I were pressed by your Lordships’ Communications and Digital Committee when we appeared before it last week. As we explained then, we remain committed to ensuring that Ofcom maintains its regulatory independence, which is vital to the success of this framework. As we are introducing ground-breaking regulation, our aim is to balance the need for the regulator’s independence with appropriate oversight by Parliament and the elected Government.

We intend to bring forward two changes to the existing power: first, replacing the “public policy” wording with a defined list of reasons that a direction can be made; and secondly, making it clear that this element of the power can only be used in exceptional circumstances. I would like to reassure noble Lords—as I sought to reassure the Select Committee—that the framework ensures that Parliament will always have the final say on codes of practice, and that strong safeguards are in place to ensure that the use of this power is transparent and proportionate.

Before we begin our scrutiny in earnest, it is also necessary to recognise that this Bill is not just establishing a regulatory framework. It also updates the criminal law concerning communication offences. I want to thank the Law Commission for its important work in helping to strengthen criminal law for victims. The inclusion of the new offences for false and threatening communications offers further necessary protections for those who need it most. In addition, the Bill includes new offences to criminalise cyberflashing and epilepsy trolling. We firmly believe that these new offences will make a substantive difference to the victims of such behaviour. The Government have also committed to adding an additional offence to address the encouragement or assistance of self-harm communications and offences addressing intimate image abuse online, including deepfake pornography. Once these offences are introduced, all companies will need to treat this content as illegal under the framework and take action to prevent users from encountering it. These new offences will apply in respect of all victims of such activity, children as well as adults.

This Bill has been years in the making. I am proud to be standing here today as the debate begins in your Lordships’ House. I realise that noble Lords have been waiting long and patiently for this moment, but I know that they also appreciate that considerable work has already been done to ensure that this Bill is proportionate and fair, and that it provides the change that is needed.

A key part of that work was conducted by the Joint Committee, which conducted pre-legislative scrutiny of the Bill, drawing on expertise from across both Houses of Parliament, from all parties and none. I am very glad that all the Members of your Lordships’ House who served on that committee are speaking in today’s debate: the noble Baroness, Lady Kidron; the noble Lords, Lord Stevenson of Balmacara and Lord Knight of Weymouth, who have very helpfully been called to service on the Opposition Front Bench; the noble Lord, Lord Clement-Jones, who speaks for the Liberal Democrats; as well as my noble friends Lord Black of Brentwood and Lord Gilbert of Panteg.

While I look forward to the contributions of all Members of your Lordships’ House, and will continue the open-minded, collaborative approach established by my right honourable friend the Secretary of State and her predecessors—listening to all ideas which are advanced to make this Bill as effective as it can be—I urge noble Lords who are not yet so well-versed in its many clauses and provisions, or who might be disinclined to accept at first utterance the points I make from this Dispatch Box, to consult those noble Lords before bringing forward their amendments in later stages of the Bill. I say that not to discourage noble Lords from doing so, but in the spirit of ensuring that what they do bring forward, and our deliberations on them, will be pithy, focused, and conducive to making this Bill law as swiftly as possible. In that spirit, I shall draw my already too lengthy remarks to a close. I beg to move.

My Lords, like many in your Lordships’ House, I am relieved to be finally speaking on the Second Reading of this important Bill. I am very grateful to the Minister for his introduction. Despite being central to a recent manifesto and having all-party support, it has taken nearly six years to get us to this moment, as the Minister alluded to. A revolving door of four Prime Ministers and seven changes in Secretary of State has not exactly been conducive to this process.

But it is also fair to say that the Bill has been strengthened by consultation and by the detailed pre-legislative scrutiny carried out by the Joint Committee, to whom I pay tribute. It means that this version of the Bill bears a very welcome resemblance to the Joint Committee’s report. I also thank the Communications and Digital Select Committee for its ongoing work and warmly acknowledge the long-term campaigning work of the noble Baroness, Lady Kidron, and others in and outside this House.

It seems that every passing week reminds us why stronger online regulation is needed. Just today, we read that the influence of Andrew Tate, despite his being in custody in Romania, has whipped up a storm of rape and death threats directed at my colleague in the other place, Alex Davies-Jones. And writ large is the damning verdict of the inquest into Molly Russell’s death. I want to pay tribute to the determination of her father, Ian, who is present with us today.

In today’s digital age, social media is everywhere: in our homes, workplaces and schools. With the rise of virtual reality, it is also in our heads. It is a central influence on what we buy and think, and how we interact and behave. The power and money at stake are enormous, yet the responsibilities are minimal and accountability lacking.

The focus of this long and complex Bill is on reducing the seemingly ever-increasing harms caused by social media services and search engines, whose algorithms generate detailed pictures of who we are and push us towards certain types of content, even if it impacts on our physical and mental health. As we know, Molly Russell tragically took her own life after having been bombarded with material relating to depression, self-harm and suicide.

Many platforms have upped their game since, but the need for this legislation has not diminished: there remain too many cases of children and vulnerable adults being exposed to digital content that is simply not appropriate. I welcome the arrival of the Bill, but it is too late and, due to recent changes, arguably too narrow. We must now do what we can to get it on the statute book as soon as possible.

The Government have committed to changes in your Lordships’ House, but we need to see the detail, and soon, not least because of the significant public and stakeholder interest. It has become fashionable to leave major changes to legislation until Report stage, leaving noble Lords unsighted and limiting the scope for improvement. I hope the Minister will commit to bucking this trend and give noble Lords early sight of the Government’s thinking.

On these Benches, we will, as always, work constructively with colleagues across the House, and hopefully with the Minister too, as we have already been doing. But, in so doing, we must acknowledge that this Bill is unlikely to be the last word. A future Labour Government will want to return to these issues, to tidy up any deficiencies that are identified once the Bill becomes law.

I now turn to some of our priorities. I am in no doubt that other noble Lords will add to this list. There is a legitimate concern around the decision of Ministers to take powers of direction over what is supposed to be an independent regulator and to leave so much to secondary legislation. The need for flexibility is indeed understood, but Parliament must have an active role, rather than being sidelined.

On the protection of children, despite notable progress by many platforms, too many failings exist. Several children’s charities have put forward important recommendations. The NSPCC has called for user advocacy to influence future regulation, while Barnardo’s wants restrictions on access to online pornography, holding the Government to their previous promises.

The scrapping of legal but harmful provisions means a lack of protection for vulnerable adults. The Samaritans, for example, is keen to ensure that self-harm provisions properly capture vulnerable adults as well as children. We understand that defining the term is difficult, but a solution has to be found.

On anti-Semitism, racism and general abuse, the Government shifted policy in response to a former Conservative leadership hopeful who said that we cannot legislate for hurt feelings. We believe in free speech, but it is not clear that DCMS has found the right balance with its triple shield. The toggle system may prevent users from seeing categories of harmful material, but it will still exist and influence others unless the Government compel an auto-on setting.

On violence against women and girls, I welcome the commitments made in relation to cyberflashing and making controlling behaviour a priority offence. I hope the Minister confirms that there will be work with an extensive range of relevant stakeholders to build on the amendments already made, and to identify and close potential loopholes in forthcoming text.

We find it unacceptable that the Government have stripped back the Bill’s media literacy provisions at a time when these skills are more important than ever. I am grateful to organisations such as Full Fact for highlighting the need to equip people of all ages, but particularly children, with the skills necessary to identify misinformation and disinformation. We have all seen the damage caused by vaccine disinformation, not only on Covid but on HPV. This extends to other areas; social media is awash with misleading material on nutrition, breastfeeding and natural health remedies, to name but a few. Once again, we acknowledge that some platforms perform well in response to such issues, but the recent takeover of Twitter has highlighted how swiftly and radically that can change.

I know that the Minister has been working on this agenda for some time and that he wants to get it right. We can all share our own experiences or those of friends or family in respect of online harm and abuse. We can also all cite ways in which technological innovation has improved our lives. We therefore all have a stake in improving this legislation. We have a long and complex process ahead of us, but uniquely there is no political divide on the Bill. Therefore I hope that in the finest traditions of your Lordships’ House we will work together to improve what is before us, while recognising that this is unlikely to be the last word.

My Lords, it is a pleasure to follow the noble Lord, Lord Parkinson, and the noble Baroness, Lady Merron, and the spirit of co-operation they have both shown in introducing the Bill. On our side we will be led by my noble friend Lord Clement-Jones, who is keeping his powder dry for the summing up.

I was pleased that there was praise for the pre-legislative scrutiny, which is a very useful tool in our locker. I was a member of the Puttnam committee, which in 2002 looked at what became the last Communications Act, and I took two lessons from that. The first was the creation of Ofcom as a regulator with teeth; it is important that we go forward with that. The other was the Puttnam amendment adding the protection of citizens’ interests to that of consumer interests as part of its responsibilities. Those twin responsibilities—to the consumer and the citizen—are valuable when addressing this Bill.

It is worth remembering that, although it may be a future Labour Government who deal with this, my experience is that this is not a dress rehearsal; this is the main event and we should seize the day. It has been 20 years since the last Bill, six years since the Green Paper, and five years since the White Paper, with a cavalcade of Secretaries of State. This House is entitled to stress-test and kick tyres in today’s debate and in Committee to see if the powers and scope meet the threats, challenges and opportunities posed by this technology.

We will play our part in delivering a Bill which is fit for purpose, but the Government must play theirs by being flexible in their approach in response to legitimate concerns and sensible amendments addressing them. The noble Baroness, Lady Merron, has already voiced concerns about powers left in the hands of future Secretaries of State. We will study what has been said this afternoon on those matters.

We welcome the Bill’s focus on protecting children. I do not think anybody who went to the presentation on the evidence in the Molly Russell inquest could have left with anything other than a determination that something must be done about this. Equally, the concerns of End Violence Against Women and other groups pose questions on whether this legislation goes far enough in the protections needed, which will have to be tested. There are real worries about the lack of minimum requirements for terms of service and the removal of risk assessment for adults. The noble Lord, Lord Bethell, has been raising very pertinent questions about age verification and access to pornography. The noble Lord, Lord Lipsey, and I intend to raise questions in Committee about the free pass given to newspapers by this legislation, although much of their activity is now online. There is no specific commitment, as has been said, to expand media literacy, despite it being a major recommendation of the Puttnam committee 20 years ago.

The internet has been an amazing catalyst for change, innovation and creativity. But those benefits have come at a price of targeted actions designed to cause harms to individuals and institutions. On all Benches we believe that freedom of expression is important, but liberal democracies have a right to provide a framework of protection against those who seek to harm it. Much will depend on the response to legislation and regulation by the internet companies. The public are not stupid; they can differentiate between tick-box exercises and compliance, between profit maximisation and social responsibility. The noble Lord, Lord Grade, is also not stupid and I wish him well as chair of Ofcom.

My work on the Puttnam committee 20 years ago was among the most satisfying of my parliamentary life. I hope we will all have similar feelings when we complete our work on this Bill.

My Lords, I declare my interests as chair of 5Rights Foundation and the Digital Futures Commission, my positions at Oxford and LSE and at the UN Broadband Commission and the Institute for Ethics in AI, as deputy chair of the APPG on digital regulation and as a member of the Joint Committee on this Bill.

As has already been mentioned, on Monday I hosted the saddest of events, at which Ian Russell and Merry Varney, the Russell family’s solicitor, showed parliamentarians images and posts that had been algorithmically recommended to Molly in the lead-up to her death. These were images so horrible that they cannot be shown in the media, so numerous that we could see only a fraction, and so full of despair and violence that many of the adult professionals involved in the inquest had to seek counselling. Yet in court, much of this material was defended by two tech companies as being suitable for a 14 year-old. Something has gone terribly wrong. The question is: is this Bill sufficient to fix it?

At the heart of our debates should not be content but the power of algorithms that shape our experiences online. Those algorithms could be designed for any number of purposes, including offering a less toxic digital environment, but they are instead fixed on ranking, nudging, promoting and amplifying anything to keep our attention, whatever the societal cost. It does not need to be like that. Nothing about the digital world is a given; it is 100% engineered and almost all privately owned; it can be designed for any outcome. Now is the time to end the era of tech exceptionality and to mandate a level of product safety so that the sector, just like any other sector, does not put its users at foreseeable risk of harm. As Meta’s corporate advertising adorning bus stops across the capital says:

“The metaverse may be virtual, but the impact will be real.”

I very much welcome the Bill, but there are still matters to discuss. The Government have chosen to take out many of the protections for adults, which raises questions about the value and practicality of what remains. In Committee, it will be important to understand how enforcement of a raft of new offences will be resourced and to question the oversight and efficacy of the remaining adult provisions. Relying primarily on companies to be author, judge and jury of their own terms of service may well be a race to the bottom.

I regret that Parliament has been denied the proper opportunity to determine what kind of online world we want for adults, which, I believe, we will regret as technology enters its next phase of intelligence and automation. However, my particular concern is the fate of children, whose well-being is collateral damage to a profitable business model. Changes to the Bill will mean that child safety duties are no longer an add-on to a generally safer world; they are now the first and only line of defence. I have given the Secretary of State sight of my amendments, and I inform the House that they are not probing amendments; they are necessary to fill the gaps and loopholes in the Bill as it now stands. In short, we need to ensure that child safety duties apply to all services likely to be accessed by children. We must ensure the quality control of all age-assurance systems. Age checking must not focus on a particular harm, but on the child; it needs to be secure, privacy-preserving and proportionate, and it must work. The children’s risk assessment and the list of harms must cover each of the four Cs: content harm, conduct harm, contact harm and commercial harm, such as the recommendation loops of violence and self-hatred that push thousands of children into states of misery. Those harms must be in the Bill.

Coroners and bereaved parents must have access to data relevant to the death of a child to end the current inhumane arrangement whereby bereaved families facing the devastating loss of their child are forced to battle, unsuccessfully, with tech behemoths for years. I hope that the Minister will reiterate commitments made in the other place to close that loophole.

Children’s rights must be in the Bill. An unintended consequence of removing protections for adults is that children will now cost companies vastly more developer time, more content moderation and more legal costs than adults. The digital world is the organising technology of our society, and children need to be online for their education and information to participate in civic society—they must not be kicked out.

I thank all those who have indicated their support, and the Secretary of State, the Minister and officials for the considerable time they have given me. However, I ask the Minister to listen very carefully to the mood of the House this evening; the matters I have raised are desperately urgent and long-promised, and must now be delivered unequivocally.

While millions of children suffer from the negative effects of the online world, some pay with their lives. I am a proud supporter of a group of bereaved parents for online safety, and I put on the record that we remember Molly, Frankie, Olly, Breck, Sophie and all the others who have lost their lives. I hope that the whole House will join me in not resting until we have a Bill fit for their memory.

My Lords, that is not an easy speech to follow, but I begin by declaring my interest as a Church Commissioner, as set out in the register. We have substantial holdings in many of the big tech companies. I am also vice-chair of the Church of England Ethical Investment Advisory Group. I commend the attention of noble Lords to our recent report on big tech that was published last September. There, we set out five core principles that we believe should guide our investment in and engagement with big tech companies: flourishing as persons, flourishing in relationships, standing with the marginalised, caring for creation and serving the common good. If we apply those principles to our scrutiny of this Bill, we will not only improve lives but save lives.

I will focus my remaining remarks on three areas. First, as the noble Baroness, Lady Merron, and the noble Lord, Lord McNally, have noted, the powers granted to the Secretary of State to direct Ofcom on its codes of practice and provide tactical and strategic guidance put Ofcom’s independence at risk. While I recognise that the Government have sought to address these concerns, more is required—Clauses 39 and 157 are not fit for purpose in their present form. We also need clear safeguards and parliamentary scrutiny for Secretary of State powers in the Bill that will allow them to direct Ofcom to direct companies in whatever we mean by “special circumstances”. Maintaining Ofcom’s autonomy in decision-making is critical to preserving freedom of expression more broadly. While the pace of technological innovation sometimes requires a very timely response, the Bill places far too much power in the hands of the Secretary of State.

Secondly, while the Bill encompasses activity within the remit of regulators beyond Ofcom, it is largely silent on formal co-operation. I encourage the Government to introduce a general duty to co-operate with other regulators to ensure a good and effective enforcement of the various regulatory regimes. I would be grateful if the Minister could confirm whether the Government will commit to looking at this once more.

Finally, I turn, as others have done, to the protection of children. The noble Baroness, Lady Kidron, has just spoken powerfully. Can we really claim that this Bill serves to mitigate the harm that children face online when consultation of children has so far been lacking? I welcome the Minister’s remarks about the Children’s Commissioner in this regard, but we can and should go further. In particular, we should centre our decisions on promoting children’s well-being rather than on simply minimising harm. My right reverend friend the Bishop of Durham regrets that he is unable to be in his place today. I know he plans to raise these questions as the Bill progresses.

Related to this, we must ensure that any activity online through which children are groomed for criminal exploitation is monitored. A reporting mechanism should be brought in so that such information is shared with the police. My right reverend friend the Bishop of Derby is unable to speak today, but as vice-chair of the Children’s Society, she will follow these issues closely.

This Bill has arrived with us so late and so overcrowded that I had begun to think it was being managed by my good friends at Avanti trains. However, here at last it is. I look forward to working with noble Lords to improve this important and welcome legislation. It is my hope that, as we continue to scrutinise and improve the Bill, we will move ever closer to fulfilling those five core principles I set out: flourishing as persons, flourishing in relationships, standing with the marginalised, caring for creation and serving the common good.

My Lords, I draw attention to my interests as a trustee of the Loughborough Wellbeing Centre, director of Santander and the Financial Services Compensation Scheme, chair of the Association of British Insurers and board member at Grayling. In fact, I could draw attention to all my interests, because what we are debating today, with online search engines and online platforms, are organisations that reach into every corner of our lives now. I want to thank current Ministers for getting us to this stage. We have heard that this is long overdue regulation. I plead guilty to being one of the “cavalcade” of previous Secretaries of State mentioned by the noble Lord, Lord McNally, but I am pleased that I have played my part in keeping this Bill on the road.

When we have passed this legislation, the UK will be world leading. That needs to be recognised, but it also means that this legislation is new and not easy, as we have heard. Polling from More in Common has said that in a list of six comparative European countries, the British are most likely to say that the Government are not doing enough to regulate social media platforms. In the brief time available, I want to set out some key themes and amendments which I hope to raise in Committee.

I welcome the criminal offences relating to violence against women and girls added to the Bill, but the whole environment of these platforms, where such online violence has become normalised and misogyny allowed to flourish unchecked, needs to change. I am afraid that adding selected offences is insufficient, and I will be calling for a specific code of practice, to be drafted by Ofcom, that the platforms and search engines will need to follow to show that they are taking the proliferation of violence against women and girls seriously.

We will hear today many arguments about freedom of speech and expression, but what about the right to access and participation online without being abused and harassed? Online violence against women and girls curtails women’s freedom of expression. The advice to avoid social media—which I myself, as a Member of Parliament, received from the authorities and the police—respects no one’s freedoms. As we have heard, women and girls are 27 times more likely to experience harassment online.

We have also heard from Luke Pollard in the other place a mention of incels. While this is a complicated topic, what is unfortunately true is that data from the Center for Countering Digital Hate has found that visits to incel websites are only increasing every day, and the content on them is getting more extreme. Many small platforms hosting incels set their own terms and conditions, allowing for violent and misogynistic discussions. How the Bill tackles those issues will be of great importance and a subject of discussion in this House.

I was disappointed that the legal but harmful restrictions were dropped, but I understand why Ministers chose to do so. However, I agree that, as we have already heard, the user empowerment toggle should be set to “on” by default. Just because a user decides not to see abusive and harmful content does not mean that it is not there, either influencing others or, where it is unfortunately necessary, for the user to see so that they can provide evidence to the authorities, including the police. I include my own experience of having seen that abuse, gathering it and then sending it to the authorities. If we have the toggle set to “off”, in relation to violence against women and girls the onus will yet again be on women to protect themselves, rather than the abuser being compelled to cease their abuse. Related themes to explore in Committee will be the minimum standards needed for risk assessments, as well as minimum standards for platforms’ terms and conditions; the publication of risk assessments to create a culture of transparency on the part of service providers; and further detail on how the information gathered by Ofcom under Clause 68 is to be used.

We will hear discussion—we already have—about the welcome creation of the offence of sending communication which encourages serious self-harm. However, as we have heard, Samaritans has pointed out that all such content needs to be regulated across all platforms for all users. Turning 18 does not stop young people being vulnerable to suicide or self-harm content. I also support the calls by Vicky Ford and others to specifically include eating disorders within the self-harm clause.

It was my pleasure last year to chair this House’s special committee on the Fraud Act 2006 and digital fraud. Time is short, but there will be more to say on the issues of fraud, as well as independent researchers’ access to information. My noble friend the Minister has mentioned senior manager liability. We will wait to see what the clause introduced says, but it needs to be sufficiently tough to change the culture.

I will absolutely support the amendment proposed by the noble Baroness, Lady Kidron, and that proposed by my noble friend Lord Bethell, on age verification for online pornography.

I was recently at an event in this building with tech companies, including a major search engine, who complained that, via the Bill, the Government are experimenting on them. I put it to them then, and I say now, that these companies have experimented on us, particularly our children and vulnerable adults, for years without facing the consequences of the illegal and harmful material across their platforms and search engines. The Bill is long overdue. I look forward to the debates and amendments.

My Lords, it is a privilege to follow the noble Baroness, Lady Morgan—and slightly intimidating. I draw the House’s attention to my register of interests: I am a director of the Antisemitism Policy Trust and a director of HOPE not hate, and I remain the chief executive of Index on Censorship. I have also had appalling experiences online. In all these capacities I have been intimately involved with the passage of this legislation over the last two years. Like every one of your Lordships, I desperately want to see a better and safer internet for all users, especially children and the most vulnerable, but I worry about the unintended consequences of certain clauses, particularly for our collective and legal right of freedom of expression.

There are certain core premises that should guide our approach to online regulation. What is legal offline should be legal online. We need secure and safe communication channels to protect all of us, but especially dissidents and journalists, so end-to-end encryption needs to be safeguarded. Our ability to protect our identities online can be life-saving, for domestic violence victims as much as for political dissidents, so we need to ensure that the principle of online anonymity is protected. Each of these principles is undermined by the current detail of the Bill, and I hope to work with many of your Lordships in the weeks ahead to add additional safeguards.

However, some of my greatest concerns about the current proposals relate to illegal content: the definition of what is illegal, the arbiters of illegality and, in turn, what happens to the content. The current proposals require the platforms to determine what is illegal content and then delete it. In theory this seems completely reasonable, but the reality will be more complicated.

I fear what a combination of algorithms and corporate prosecution may mean for freedom of expression online. The risk appetite of the platforms is likely to be severely reduced by this legislation. Therefore, I believe that they are likely to err on the side of caution when considering where the illegality threshold falls, leading to over-deletion. This will be compounded by the use of algorithms rather than people to detect nuance and illegal content.

I will give your Lordships an example of an unintended consequence this has already led to. A video of anti-government protests in Lebanon was deleted on some current platforms because an algorithm picked up only one word of the Arabic chants: Hezbollah, an organisation rightly proscribed in the UK. But the video actually featured anti-Hezbollah chants. It was an anti-extremism demonstration and, I would speculate, contained anti-extremist messaging that many of us would like to see go viral rather than be deleted.

Something is already twice as likely to be deleted from a platform by an algorithm if it is in Urdu or Arabic, rather than English. This will become even more common unless we tighten the definition of illegality and provide platforms with a digital evidence locker where content can be stored before a final decision on deletion is made, thus protecting our speech online.

The issue of deletion is deeply personal for me. Many of your Lordships may be aware that, as a female Jewish Labour Member of the other place, I was subjected to regular and vicious anti-Semitic and misogynist online abuse—abuse that too often became threats of violence and death. Unfortunately, these threats continue and have a direct effect on my personal security. I know when I am most vulnerable because I see a spike in my comments online. These comments are monitored—thankfully not by me—and, when necessary, are referred to the police, with the relevant evidence chain, so that people can be prosecuted.

Can the Minister explain how these people will be prosecuted for harassment, or worse, if the content is automatically deleted? How will I know if someone is threatening to kill me if the threat has already gone? I genuinely believe that the Government wish to make people safer online, as do we all, but I fear that this Bill will not only curtail free speech online but make me and others much less safe offline. There is significant work to do to make sure that is not the case.

My Lords, the internet is a double-edged sword. It enables people to connect with work, education, information and social activities. It gives visibility to those often hidden from society. But it can be a dangerous place for many, especially disabled people, many of whom are vulnerable to attack merely for who they are. I want to focus on the indiscriminate abuse that disabled people face online.

In January 2019, the Petitions Committee published its report Online Abuse and the Experience of Disabled People, following a petition by Katie Price about her son Harvey. The committee heard evidence of extreme levels of abuse, not only on social media but in online games, web forums and in media website comments. As one disabled poet and writer wrote:

“I’ve been called an ‘it’ many times—‘What is IT doing?’ … I’ve had remarks about how I look in my wheelchair, and a few times the statements, ‘You should have been aborted’, and, ‘You don’t deserve to live’”,

and, “Why are you online?” The committee rightly concluded that the law was not fit for purpose.

The Bill does not do enough to address such abuse. The other place recently weakened the protections for disabled people, replacing the provisions on legal but harmful content with a triple shield of duties to remove illegal content for adults and harmful content for under-18s, and to empower adult users.

Under Clause 12, social media companies must now tackle content which is abusive or incites hatred towards disabled people. That is encouraging, but it is the companies that decide that, so in practice it may not change anything. We know that moderating social media is the Wild West. There is no consistency between platforms. It depends on the algorithms they use and the discretion of their moderators.

Clause 18 adds to those problems, requiring platforms also to consider freedom of expression and privacy issues. They will be in an impossible position, caught between competing claims for protection from abuse and freedom of speech. At the very least, the legal but harmful provisions must be restored.

Greater control for disabled people using social media is laudable. They must be consulted on the best way to achieve that. The Bill says that terms of service must be “clear and accessible”. It should provide for Ofcom to give guidance with input from disabled people. It should not be left to social media services to set their own standards.

Consistency is also vital for the way the verification process works. Clause 57 refers to “verification … of any kind” and “clear and accessible” explanations. Ofcom’s guidance will be crucial on both issues, with disabled people’s input essential. It should be mandatory to follow the guidance.

Will the Minister assure me that he will address these matters before Committee? Will he meet me and disability organisations which have expertise in this field for guidance? This is a landmark Bill and very welcome. Let us ensure that it works for everybody, especially those who need it most.

My Lords, it is an honour and privilege to follow the noble Baroness, Lady Campbell, and all those who have spoken in this debate. As a member of your Lordships’ Committee on Artificial Intelligence and a founding member of the Centre for Data Ethics and Innovation, I have followed the slow progress of this Bill since the original White Paper. We have seen increasing evidence that many social media platforms are unwilling to acknowledge, let alone prevent, harms of the kind this vital Bill addresses. We know that there is an all too porous frontier between the virtual world and the physical world. The resulting harms damage real lives, real families, and real children, as we have heard.

There is a growing list of priority harms and now there is concern, as well as excitement, over new AIs such as ChatGPT; they demonstrate yet again that technology has no inherent precautionary principles. Without systemic checks and balances, AI in every field develops faster than society can respond. We are and for ever will be catching up with the technology.

The Bill is very welcome, marking as it does a belated but important step towards rebalancing a complex but vital aspect of public life. I pay tribute to the Government and to civil servants for their patient efforts to address a complex set of ethical and practical issues in a proportionate way. But the job is not yet fully done.

I will concentrate on three particular areas of concern with the draft Bill. First, removal of risk assessments regarding harm to adults is concerning. Surely every company has a basic moral duty to assess the risk of its products or services to customers and consumers. Removal can only undermine a risk-based approach to regulation. Can the Minister explain how conducting a risk assessment erodes or threatens freedom of speech? My second concern, mentioned by others, is the Secretary of State’s powers in relation to Ofcom. This country has a record of independence of our own media regulators. Others have touched on that, so I will not elaborate. The third area of concern I wish to raise is the Bill’s provision—or rather lack of provision—over disinformation of various kinds. I currently serve on your Lordships’ Environment and Climate Change Committee; climate disinformation and medical disinformation inflict substantial harms on society and must be included in user empowerment tools.

Other right reverend Prelates will raise their own concerns in the forthcoming Committee. My right reverend friend the Bishop of Gloucester believes that it is imperative that we prevent technology-facilitated domestic abuse, as well as bring in a code of practice to keep women and girls safe online. To help young people flourish, we should look at controlling algorithmically served content, restrictions on face and body-editing apps, as well as improving media literacy overall. She is unable to speak today, but will follow these issues closely.

The Bill is vital for the health of children and adults, and the flourishing of our whole society. I look forward to progress being made in this House.

My Lords, I refer to my registered interests, in particular my work with Common Sense Media, a US not-for-profit that is focused on internet safety for children. What a pleasure it is to follow the right reverend Prelate the Bishop of Oxford—my local bishop, no less. I always find it a great thing that it is our Bishops who read their speeches from iPads; we have iBishops in this Chamber who are far more technologically advanced than the rest of us. What a pleasure it is to see our national treasure the Arts Minister on the Front Bench; yesterday he launched the 2021 report of the Portable Antiquities Scheme, which displays ancient treasures dug up from many centuries ago. I thought he might be presented with the first consultation paper on the Online Safety Bill, because it has taken so long to get to the stage where we are today.

A dozen years ago, when we talked about the impact of the internet, we were actually focused on copyright infringement; that was the big issue of the day. It is quite instructive to think about what happened there; it was a combination of technology, but also business solutions, licensing and the creation of companies such as Spotify that had an impact. But piracy remains with us, and will continue to remain with us because of the internet.

I like to think that the Jurassic journey of the Online Safety Bill began with an Adjournment debate by the then Member for Devizes, Claire Perry, who began a debate about protecting children from adult content on the internet, which is one of the most important issues. That led to her being commissioned to do a review by the then Prime Minister, David Cameron, and that began the ball rolling. But Prime Minister David Cameron’s biggest intervention, which I remember well, was to tackle Google on the issue of child sex abuse. At the time the prevailing mood, which still prevails, was that politicians do not understand technology—you cannot regulate the internet, “Get your tanks off our lawn”. But Cameron said, “We will legislate unless you do something”, and Google, which said it was impossible, eventually came up with something like 150,000 search terms which would give a non-search return and refer the searcher to get some help, frankly—that is what the page would come up with.

That was instructive because it was a combination of government action, but in tackling child sexual abuse we had relied on not-for-profits, such as the Internet Watch Foundation. As we debate a piece of legislation and call on the Government to do this or that, it is important to remember that the internet has always had many governors, if you like—civic society, business, not-for-profits and charities—all of which must continue to play an important role in internet policing, as must the platforms themselves, where technology has improved in leaps and bounds. We have heard some of the criticisms of the technology they use and the impact it has on the people who are relied on by some of these technology companies to police content. Nevertheless, they have made progress. We must also remember that the platforms are not publishers or broadcasters; they are still new technology.

I unequivocally support the Bill—frankly, in whatever form it takes once your Lordships have fully considered it. It must be passed because it is time to regulate the internet. Ofcom is absolutely the right regulator to do this. I have been hugely impressed by the amount of work it has put into preparing for this role. The overall approach taken in the Bill is the right one: to police not every piece of content but the terms and conditions. This week, Ofcom published a very important document pointing out that transparency, holding the platforms to account and exposing how they regulate their content will make a massive difference.

The Government have made the right compromise on legal but harmful. I counsel against the Christmas tree effect of wanting to hang every single different concern on to the Bill; let us keep our eye on the prize. Having said that, I will fully support my noble friend Lord Bethell in his points on age verification and the noble Baroness, Lady Kidron, with her amendment.

This is the end of the beginning. The Bill will not eradicate all the nasty things we see on the internet but, for the first time, the platforms will be accountable. It is very important to support this legislation. The Minister did not mention the European Union’s important legislation on this issue, but we are beginning to make progress across the world.

My Lords, it is a pleasure to follow other noble Lords on this issue. This legislation is undoubtedly long overdue. Without doubt, the internet has changed the way in which we live our lives. For many this change has been positive. However, the internet, in particular social media, has created a toxic online world. We have only to listen to the noble Baroness, Lady Kidron, and my noble friend Lady Anderson to realise that. As a result, the internet has become abusive, misogynistic and dangerous. Many noble Lords from across the House have personal experience of this toxic world of online abuse. Any measures that seek to place curbs and limits on that type of content are to be welcomed.

While it is important to protect adults from abuse online, it is more important that we get the Bill’s protections right for children. I welcome its provisions in respect of age verification, but for many across the House it is a surprise that we are even debating age verification. Legislation was passed in 2017 but inexplicably not implemented by the Government. That legislation would have ensured that age verification was in place to protect children over five years ago. While the Bill includes age assurance measures, it is disappointing that its provisions are not as robust as those passed in 2017. Also, it is concerning that age verification is not uniformly applied across Parts 3 and 5. What actions and steps will the Minister and his colleagues take in Committee with government amendments on this issue?

As this Bill makes progress through this House, it will be important to ensure that age verification is robust and consistent, but we must also ensure that what happened to the Digital Economy Act cannot be allowed to happen to this legislation. The Government cannot be allowed to slow down or even abandon age verification measures. This Bill, while welcome, needs to be amended to ensure that age verification is actually implemented and enforced. This must happen as quickly as possible after the Bill becomes law. I believe that age verification should be in place no later than six months after this Bill is passed.

The need for robust age verification is beyond any reasonable argument. Children should be protected from viewing harmful content online. The law in this regard should be simple. If a platform contains pornographic content, children should be prevented from viewing it. More than that, pornography that is prohibited offline should be prohibited online. Reading the provisions of this Bill carefully, it is my belief that the Bill falls short in both regards.

I look forward to the passage of this Bill through the House and, while it is a very welcome development to have this Bill before us at last, it is important that the provisions and clauses within it are strengthened.

My Lords, I have two observations, two pleas, one offer of help and four minutes to deliver all this, so here goes.

Observation one is that this Bill is our answer to the age-old question of “quis custodiet ipsos custodes?” or, in the vernacular, “Who watches the watchmen?” With several thousand strokes of the pen, Parliament is granting to itself the power to tell tens of thousands of online services how they should manage their platforms if they wish to access the UK market. Parliament will give directions to Ofcom about the outcomes it wants to see and Ofcom will translate these into detailed instructions and ensure compliance through a team of several hundred people that the platforms will pay for. In-scope services will be given a choice—pay up and follow Ofcom’s instructions or get out of the UK market. We are awarding ourselves significant superpowers in this Bill, and with such power must come great scrutiny, as I am sure there will be in this House.

My second observation is that regulating online content is hard. It is hard because of scale. If regulating traditional media is like air traffic controllers managing a few thousand flights passing over the UK each day, then regulating social media is more like trying to control all 30 million private cars that have access to UK roads. It is hard because it requires judgment. For many types of speech there is not a bright line between what is legal and illegal, so you have to work on the basis of likelihoods and not certainties. It is hard because it requires trade-offs—processes designed to remove “bad” content will invariably catch some “good” content and you have to decide on the right balance between precision and recall for any particular system; the noble Baroness, Lady Anderson of Stoke-on-Trent, has already referred to some of these challenges with specific examples.

I make this observation not to try to elicit any sympathy for online services, but rather some sympathy for Ofcom as we assign it the most challenging of tasks. This brings me to my first plea, which is that we allow Ofcom to make decisions about what constitutes compliance with the duties of care in the Bill without others second-guessing it. Because judgments and trade-offs are a necessary part of content moderation, there will always be people who take opposing views on where lines should have been drawn. These views may come from individuals, civil society or even Ministers and may form important and valuable input for Ofcom’s deliberations. But we should avoid creating mechanisms that would lead to competing and potentially conflicting definitions of compliance emerging. One chain of command—Parliament to Ofcom to the platforms—is best for accountability and effective regulation.

My second plea is for us to avoid cookie banner syndrome. The pop-ups that we all click on when visiting websites are not there for any technical reason but because of a regulatory requirement. Their origins lie in a last-minute amendment to the e-privacy directive from Members of the European Parliament who had concerns about online behavioural advertising. In practice, they have had little impact on advertising while costing many millions and leaving most users at best mildly irritated and at worst at greater risk as they learn to click through anything to close banners and get to websites.

There are several elements in this Bill that are at risk of cookie banner syndrome. Measures such as age and identity verification and content controls can be useful if done well but could also be expensive and ineffective if we mandate solutions that look good on paper but do not work in practice. If you see me mouthing “cookies” at you as we discuss the Bill, please do not see it as an offer of American biscuits but as a flag that we may be about to make an expensive mistake.

This brings me to my final point, which is an offer of technical advice for any noble Lords trying to understand how the Bill will work in practice: my door and inbox are always open. I have spent 25 years working on internet regulation as poacher turned gamekeeper, turned poacher, turned gamekeeper. I may have a little more sympathy with the poachers than most politicians, but I am all gamekeeper now and keen to see this Bill become law. For those who like this kind of thing, I share more extensive thoughts on the Bill than I can get into four minutes in a blog and podcast called “Regulate Tech”.

My Lords, I thank Mencap and the Royal College of Psychiatrists for their briefings. I will speak against the change in the other place which waters down the protections offered to adults, and focus in particular on adults without capacity.

The original Bill included protections for adults under the umbrella of “legal but harmful”, which gave robust directions to platforms on what content to remove. These protections must be reinstated; the triple shield is not enough. Your Lordships are presented with a system where social media platforms must filter only

“to the extent that it is proportionate to do so”,

assuming that all adults are capacitous all of the time and that they will be responsible for making their own choices to avoid seeing harmful content.

I recognise that there is an intended new duty for services to undertake a risk assessment on the impact of certain material on children, to tackle the promotion of sites which share harmful content and to prevent children witnessing it, but this applies only to children. I agree with my noble friend Lady Kidron that tech companies must design for safety, just as we expect in the physical environment.

My main point is that there is no clear distinction between childhood and adulthood when it comes to mental health. I am concerned about the mental health consequences for anybody, whether child or adult, of seeing some of the images, messaging and push notifications which relentlessly pursue anyone who has ever engaged with one of the horrific sites like those seen by 14 year-old Molly Russell. These images are harmful to 14 year-olds; they are harmful to 24 year-olds; and they are harmful to 74 year-olds. Once seen, it is very hard to unsee them.

Misinformation and negative messaging are harmful to anyone who may struggle to belong and feel valued, whether at a vulnerable moment in their lives or as part of an ongoing struggle with depression. One in 20 Google searches is for health-related information. People in the UK apparently make 27 searches a minute for “depression”, 22 a minute for “stress”, and 21 a minute for “anxiety”. Given the waiting times for mental health support in the community, perhaps it is unsurprising that people seek help online. This Bill must have an emphasis on prevention. The Bill places duties on regulated providers but, as of June 2022, more than 500 hours of video were uploaded to YouTube every minute. This is content created and viewed by its users at a rate at which any purely reactive approach is doomed to fall quickly behind.

As legislators we must think of society as a whole, not just those fully engaged, economically productive citizens who currently feel invulnerable. Making sure that legislation works for people with a learning disability and those who may not have the understanding needed to protect themselves from harmful content should not be an add-on. Could the Minister suggest how the Bill could deliver greater protections to people with a learning disability or another cognitive or mental health condition that puts them at increased risk of online harm?

As I have said before, if we could get it right for people with learning disabilities, we could actually get it right for everyone.

My Lords, I am humbled to speak in this debate among many noble Lords who have spent years involved in or campaigning for this landmark legislation. I salute all of them and their work.

Like many, I support some parts of this Bill and am sceptical about others. The tension between free speech, privacy and online safety is not an easy one to resolve. We all accept, however reluctantly, that one Bill cannot cure all social ills—indeed, neither should it try. In fact, when it comes to online regulation, this is not the only legislation that is urgent and necessary: the digital markets, competition and consumer Bill is a critical, yet still missing, piece of the jigsaw in achieving a strong regulatory framework. I hope the Government will bring it forward swiftly.

As my noble friend Lord Vaizey has already said, I see this Bill as the beginning of online regulation and not the end. I see it as our opportunity to make a strong start. For me, the top priority is to get the regulatory fundamentals right and to ensure we can keep updating the regime as needed in the years ahead. With my chair of the Communications and Digital Committee hat on, I will focus on key changes we believe are needed to achieve that. As I cannot do that justice in the time available, I direct any keen readers to our committee’s website, where my letter to the Secretary of State is available.

First, the regulator’s independence is of fundamental importance, as the noble Baroness, Lady Merron, and others have already mentioned. The separation of powers between the Executive and the regulator is the cornerstone of media regulation in western Europe. Any government powers to direct or give guidance should be clearly defined, justified and limited in scope. The Online Safety Bill, as it stands, gives us the opposite. Future Governments will have sweeping powers to direct and interfere with Ofcom’s implementation of the regulations.

I will come, in a moment, to my noble friend the Minister’s proposed remedy, which he mentioned in his opening remarks, but I stress that this is not a general complaint from me or the committee about executive overreach. Many of the Bill’s executive powers are key to ensuring the regime is responsive to changing needs, but there are some powers that are excessive and troubling. Clause 39 allows the Secretary of State to direct Ofcom to change its codes of practice on regulating social media firms. That is not about setting priorities; it is direct and unnecessary interference. In our view, the Government’s proposed amendment to clarify this clause, as my noble friend described, remains inadequate and does not respect the regulator’s independence. Clause 39 also empowers the Secretary of State to direct Ofcom in a private form of ping-pong as it develops codes of practice. This process could in theory go on for ever before any parliamentary oversight comes into play. Other powers are equally unnecessary. Clause 157 contains unconstrained powers to give “guidance” to Ofcom about any part of its work, to which it must have regard. Again, I fail to see the need, especially since the Government can already set strategic priorities and write to Ofcom.

Moving on, my committee is also calling for risk assessments for adult users to be reinstated, and this has already been mentioned by other noble Lords. That would have value for both supporters and critics of “legal but harmful”, by requiring platforms to be transparent about striking the balance between allowing adult users to filter out harmful content and protecting freedom of speech and privacy.

Finally, given the novel nature of the Bill, I hope the Government will reconsider their unwillingness to support the setting up of a Joint Committee of Parliament to scrutinise digital regulation across the board. This would address many general and specific concerns about implementation and keeping pace with digital developments that have been raised recently. Parliament needs to properly discharge its responsibilities, and fragmented oversight via a range of committees will not be good enough in this new, modern world.

Overall, and with all that said, I commend my noble friend and his colleagues for getting us to this point. I look forward to, and will support him in, completing the passage of this legislation in good order.

My Lords, it is a pleasure to follow the noble Baroness, Lady Stowell, and so many other fine speeches today. I should remind your Lordships of my interests. In particular, I have been working with GoBubble, which provides social media filtering technology. I was also a member of the Joint Committee on this Bill and was previously on the Select Committee on Democracy and Digital Technologies, chaired by the noble Lord, Lord Puttnam.

Right at the heart of this Bill are just two interrelated factors. First, there are bad actors: people who deliberately or carelessly do harm to others both in the real world and virtually, both physically and mentally. Our problem is how content from these bad actors interacts with the systems and processes in the online world that personalise and amplify that content. In 2021, 44% of all global spending on advertising was with Meta and Alphabet-owned businesses. Their platforms, such as Facebook, Instagram and YouTube, are machines with the objective of maximising engagement time on the platform in order to sell more advertising.

The machines have no ethics; they have business objectives. If that means feeding outrageous, disturbing or harmful content, so be it. If that means pushing at Molly Russell content that has now been implicated by the coroner in her death, so be it. If that means the corruption of children, self-harm or fraud, so be it. Whatever turns you on, keeps you engaged and keeps you on the platform is what the machines will push your way. This week, the Children’s Commissioner for England reported that one in five boys watches pornography at least daily; that more than half of frequent users seek out violent sex acts; and that Twitter is the site where the highest proportion report seeing explicit sexual content.

The platforms are not all bad but the harms of manipulation and corruption are real and urgent. We must, and will, work together to get this Bill improved and passed by the summer. In doing so, our job with this Bill is to impose ethics on the algorithms used by platforms. This is less about bad content and more about systems. It is about content takedown and content suppression. It is as much about freedom of reach as freedom of speech. For too many people—especially women and girls, as the noble Baroness, Lady Morgan, mentioned—their freedom of expression is constrained by platforms because they are shouted down and abused. They need better protection.

Without change, vulnerable adults with learning difficulties will not be protected by this Bill. Without change, the corruption of truth and democracy by the likes of Trump and Putin will continue. Without change, the journalistic and democratic exemptions in the Bill will be exploited by the likes of Tommy Robinson to spread bile. Without change, content from the likes of Andrew Tate will continue to be amplified. His videos have been viewed more than 13 billion times on TikTok alone, including by any of our children whom we have allowed an account. Teachers, parents and grandparents cannot keep up with what is going on with children online; they need ongoing education and help. I am afraid that Ofcom is not cutting through with its media literacy duty. We must use this Bill to change that. We need to constrain the Secretary of State’s powers over Ofcom so that it is properly independent and give young people themselves more influence over the regulator.

There is much to do. This is as important a job of work as any I have been a part of during my 22 years in Parliament. I look forward to working with all Peers to deliver a Bill that prevents harm, criminalises abusers and overlays human ethics on to these machines of mass manipulation.

My Lords, I thank my noble friend Lady Kidron for her tenacious moral leadership on this issue. I remind noble Lords that, when we passed the Tobacco Advertising and Promotion Act, none of us predicted tobacco companies’ development and marketing of vapes with higher and more addictive nicotine content than that in cigarettes. It was a simple lesson.

A gap now in this Bill is the difficult issue of “legal but harmful”. We should not focus on the difficulty of defining this, but rather on the design and standards of algorithms that internet platforms use to commercial advantage, dodging any responsibility for what happens and blaming the end user.

Before the Government amended Clauses 12 and 13, category 1 service providers would have been forced to risk-assess across their sites and provide information on this in their terms of service, including how harmful content was to be managed. But this is now gone and, as a result, the digital environment will not be detoxified as originally intended. What pressures, if any, were exerted on government by commercial and other sources to amend these clauses?

It matters that the Bill now treats people under 18 and over 18 very differently, because the brain’s development and peak addictive potential from puberty does not stop at 18. Those in their 20s are at particular risk.

The social media platforms act commercially, pushing out more content, including online challenges, as their algorithms pick up a keyword—whether spelled correctly or incorrectly—a mouse hovering over an image or a like response. Currently, platforms judge addiction and profit by the time spent on a platform, but that is not how addictions work. Addiction is the reward-reinforcing behaviour that evokes a chemical response in the brain that makes you want more. Hence the alcoholic, the gambling addict, the drug addict and so on keep going back for more; the sex addict requires ever more extreme images to gain stimulation; the user will not switch off access.

Those whose emotional expression is through abuse and violent behaviour find more ways to abuse to meet their urge to control and vent feelings, often when adverse childhood experiences were the antecedent to disastrous destructive behaviour. The unhappy young adult becomes hooked in by the images pushed to them after an internet search about depression, anorexia, suicidal ideation and so on. The algorithm-pushed images become compulsive viewing, as ever more are pushed out, unasked for and unsearched for, entrapping them into escalating harms.

Now, the duties in Clause 12 are too vague to protect wider society. The user should be required to opt in to content in order to follow it, not to opt out. The people controlling all this are the platform companies. They commission the algorithms that push content out. These could be written completely differently: they could push sources of support in response to searches for gambling, eating disorders, suicidal ideation, dangerously extreme sex and so on. Amending the Bill to avoid escalating harms is essential. Some of the harms are ones we have not yet imagined.

The platform companies are responsible for their algorithms. They must be made responsible for taking a more sophisticated, balanced-risk approach: the new technology of artificial intelligence could detect those users of their platforms who are at particular risk. In daily life offline, we weigh up risk, assessing harms and benefits in everything, filtering what we say or do. Risk assessment is part of life. That does not threaten freedom of speech, but it would allow “legal but harmful” to be addressed.

The Bill presents a fantastic opportunity. We must not throw it away.

My Lords, I declare my interest, as set out in the register, as a member of the advisory council of the Free Speech Union.

This is an important Bill. It has taken time to get to us, and rightly so. Many important requirements have to be balanced in it—the removal of illegal material, and the protection of children, as we have heard so movingly already today. But, as legislators, we must also have an eye on all elements of public policy. We cannot eliminate every evil entirely, except at unacceptable cost to other objectives and, notably, to free speech.

The Bill, as it was developing last summer, was damaging in many ways to that objective. At times I was quite critical of it, so I welcome the efforts that have been made by the new broom and new team at DCMS to put it in a better place. It is not perfect, but is considerably better and less damaging to the free speech objective. In particular, I welcome the removal of the so-called legal but harmful provisions, their replacement with a duty to empower users and the decision to list out the areas that this provision applies to, rather than leaving it to secondary legislation. I also welcome the strengthening of provisions to protect the right to free speech and democratic debate more broadly, although I will come on to a couple of concerns, and the dropping of the new harmful communications offence in the original Bill. It is clear, from what we have heard so far today, that there will be proposals to move backwards—as I would see it—to the original version of the Bill. I hope that the Government will be robust on that, having taken the position that they have.

Although the Bill is less damaging, it must still be fit for purpose. With 25,000 companies in its scope, it also affects virtually every individual in the country, so it is important that it is clear and usable and does not encourage companies to be too risk averse. With that in mind, there are areas for improvement. Given the time constraints, I will focus on free speech.

I believe that in a free society, adults—not children but adults—should be able to cope with free debate, if they are given the tools to do so. Noble Lords have spoken already about the abuse that they get online, and we all do. I am sure I am not unique in that; some of it drifts into the real world as well, from time to time. However, I do not look to the Government to defend me from it. I already have most of the tools to turn that off when I want to, which I think is the right approach. It is the one that the Government are pursuing. Free speech is the best way of dealing with controversial issues, as we have seen in the last few weeks, and it is right for the Government to err on the side of caution and not allow a chilling effect in practice.

With this in mind, there are a couple of improvements that I hope the Government might consider. For example, they could require an opt-out from seeing the relevant “legal but harmful” content, rather than an opt-in to see it, and ensure those tools are easy to use. There is otherwise a risk that risk-averse providers will block controversial content and people will not even know about it. It could be useful to require providers to say how they intend to protect freedom of speech, just as they are required to say explicitly how they will manage the Clause 12 provisions. Without that, there is some risk that freedom of speech may become a secondary objective.

To repeat, there has been considerable improvement overall. I welcome my noble friend the Minister’s commitment to listen carefully to all proposals as we take the Bill through in this House. I am happy to support him in enabling the passage of this legislation in good order soon.

My Lords, I welcome the Bill, but regret the time it has taken to arrive. To make the UK the safest place in the world to be online, it must be strengthened, and I will support amendments that would ensure greater protection for children through proper age assurance. The damage to children from exploitation by social media cannot continue. The state must regulate, using severe penalties, to force platforms to behave with greater responsibility as they cannot be trusted to self-regulate. The rise in suicide and self-harm and the loss of self-esteem are ruining young lives. The platforms must take greater responsibility; they have the money and the technology to do this but need stronger incentives to act, such as the promised executive criminal liability amendment.

Ofcom faces a formidable challenge in ensuring that companies adhere to their own terms and conditions on content moderation. Heavy fines are not enough. Ofcom will need guidance in setting codes of practice from not only the three commissioners but NGOs, such as the Internet Watch Foundation, and an advocacy body for children to continually advise on emerging harms. A new regulatory regime to address illegal and harmful content online is essential but, with “legal but harmful” removed from the original Bill, we have lost the opportunity to detoxify the internet.

Concentrating on the big platforms will miss the growth of bespoke platforms that promote other harms such as incel culture, a threat to women but also to young men. Incels, or involuntary celibates, use mainstream platforms such as YouTube to reel in unsuspecting young men before linking them to their own small, specialist websites, but these are outside the scope of category 1 provision and therefore any minimum standards. These sites include not only sexist and misogynistic material but anti-Semitic, racist, homophobic and transphobic items, and even paedophilia. One of the four largest incel forums is dedicated to suicide and self-harm. HOPE not hate, the anti-fascist campaign, has warned that smaller platforms used by the far right to organise and radicalise should be under the same level of scrutiny as category 1 platforms.

User empowerment features, part of the triple shield, such as options to filter out content from unverified users and abusive content, put the onus on the user to filter out material rather than filters being turned on by default. Ofcom must ensure a statutory duty to promote media literacy by the largest platforms as part of their conditions of service. The Bill should make children’s risk assessment consistent across all services, and should tackle the drivers of harm and the design of the service, not just the content.

I welcome the new offences targeting harmful behaviour, including epilepsy trolling, cyber flashing and the sending of manufactured deepfake intimate images without consent. Despite the Bill adding controlling or coercive behaviour to the list of priority offences, more needs to be done to protect women, one in three of whom has experienced online abuse. Ofcom must add a mandatory code of practice regarding violence against women and girls so that tech companies understand they have a duty to prioritise their safety.

The Bill must prevent the relentless promotion of suicide and self-harm that has destroyed the lives of young people and their families. I commend the bravery of Ian Russell, who is campaigning to prevent other deaths following the tragic suicide of his daughter, Molly. I back the amendments from the noble Baroness, Lady Kidron, to ensure that coroners and bereaved families can access social media content. I applaud all those campaigners who want to see the Bill implemented urgently, and I will work with other noble Lords to strengthen it.

My Lords, I support this important Bill, but with some concerns. As drafted, it does not go far enough to fully protect children and young people online. The consequences of the policies we decide in this Bill will affect the whole of society in decades to come.

I have been working on the online pornography issue for the last 10 years. In April 2017, this House passed legislation that required age verification for pornography websites to prevent children accessing them. We were promised that social media platforms would be included later on, but that did not happen. It is hard to believe that almost six years ago this House passed the Digital Economy Act, whose Part 3 was never implemented by this Government. So here we are, still debating age verification for pornography. This is simply unacceptable—a shocking failure of society. It is now time to act fast, and we must make sure that we do it right.

I am concerned that the Bill does not go as far as what was passed in 2017. Even if the Bill is passed, I do not believe that it will deliver age verification quickly. If Ofcom’s road map on the implementation of the Bill is to be believed, it could be three years before enforcement proceedings are issued against pornography websites that allow children to access them.

Research by the BBFC found that children as young as seven are innocently stumbling across pornography online and that 51% of all children aged 11 to 13 have watched pornography online—according to Barnardo’s, 54 million times. We are creating a conveyor belt of children addicted to porn, which will affect their long-term well-being and sexual behaviour.

A fundamental problem with the Bill is that it does not deal with pornography as a harm. The Government state that it is designed to ensure that what is unacceptable offline will also be unacceptable online. However, in respect of pornographic content, the Bill as drafted does not meet that goal. Material that is extreme and prohibited offline is widely available online. Evidence shows that consumption of extreme and prohibited material, such as content that sexualises children—and that includes adults dressing up as children—can lead on to the viewing of illegal child sexual abuse material and an interest in child sex abuse. It is not only children who are at risk: men who watch extreme and prohibited material online are more likely to be abusive towards women and girls.

What is needed is a stand-alone part of the Bill that deals with all pornographic content and sets out a clear definition of what pornography is. Once defined, the Bill should require any website or social media platform with content that meets that definition to ensure that children cannot access that material, because porn can be a gateway to other harms. Contrary to what some people believe, technology exists that can accurately age-verify a user without compromising that person’s privacy. The groundwork is done, and as more countries implement this type of legislation, the industry is becoming increasingly equipped to deal with age verification. France and Germany are already taking legal action to enforce their own laws on the largest adult websites, with several already applying age checks. There is no reason why this cannot be implemented and enforced within six months of the Bill becoming law. If that is too hard for the social media platforms, they can simply remove porn from their pages until they are ready to keep that harm away from our kids.

Childhood lasts a lifetime, and we have the opportunity to ensure that pornography is not a harm inflicted on our children. We owe it to them. I declare an interest as vice-president of Barnardo’s.

My Lords, I declare an interest as a series producer of online and linear content. I, like many noble Lords, can hardly believe that this Bill has finally come before your Lordships’ House. It was in 2017, when I first joined the Communications and Digital Committee, that we started to look at online advertising. We went on to look at regulating the internet in three separate inquiries. I am pleased to see some of those recommendations in the Bill.

It is not surprising that I support the words of the present chair of the committee, the noble Baroness, Lady Stowell, when she said that the Secretary of State still has far too many powers over the regulator. Draft codes of practice, in which Ofcom can set the parameters and direction for the tech companies, and the review of their implementation, will be central in shaping those companies’ terms of service. Generally, in democracies, we are seeing regulators of the media given increasing independence, with Governments limiting themselves to setting up their framework and then allowing them to get on with the task at hand. I fear the Bill is not doing that. I understand that the codes will be laid before Parliament, but I would support Parliament having a much stronger power over the shaping of those regulations.

I know that Labour supports a Select Committee having the power to scrutinise this work, but having served on the Communications and Digital Committee, I fear that the examination of consultations from Ofcom would monopolise its entire work. I support the pre-legislative committee’s suggestion of a Joint Committee of Parliament, whose sole job would be to examine regulations and give input. I will support amendments to this effect.

I am also worried about Clauses 156 and 157. I listened to the Minister when he said that amendments to the Secretary of State’s powers of guidance will be brought before the House and that they will be used only in exceptional circumstances. However, the list of subjects on which I understand the Minister will then be able to intervene is still substantial, ranging from public safety through economic policy and burdens to business. Are the Government prepared to consider further limiting these powers to intervene?

I will also look at risk assessments in the Bill. They need to go further than illegal content and child safety. The empowerment lists in Clause 12 are not risk assessed and do not seem to have enough flexibility for what noble Lords know is an ever-changing world of harms. The volume of online content means that moderation is carried out by algorithms. During the inquiries in which I was involved, we were told repeatedly that algorithms are very bad at distinguishing humour and context when deciding on harmful content. Ensuring that the platforms’ systems moderate correctly is difficult. There was a recent case of that: the farcical blocking by Twitter of the astronomer Dr Mary McIntyre, whose account was suspended because her six-second video of a meteor shower was mistaken by the Twitter algorithms for a porn video. For weeks, she was unable to get any response from Twitter. Such mistakes happen only too frequently. Dr McIntyre’s complaint is only one of millions made every year against the tech companies, for being either too keen or not keen enough to take down content and, in some cases, to block accounts. So the Bill needs to include a risk assessment which looks at the threat to free speech from any changes in those systems. Ofcom needs to be able to create those risk assessments and to produce annual reports which can then be laid before a Joint Committee for Parliament’s consideration. That should be supported by an ombudsman.

I would also like to see the definition of safety duties on platforms to take down illegal content changed from “reasonable grounds” to the platform being aware that the content is “manifestly illegal”—and, if possible, for third parties, such as the NCA, to be involved in the process. That will reduce the chance of chilling free speech online as much as possible.

I am also aware that there has been concern over the duties to protect news publishers and journalistic content. Like other noble Lords, I am worried that the scope in respect of the latter is drawn too widely in the Bill, and that it covers all content. I would support amendments which concentrate on protecting journalism in the public interest. The term “in the public interest” is well known to the courts, is present in Section 4 of the Defamation Act, and is used to great effect to protect journalism which is judged to be in the public interest.

I welcome the Bill after its long journey to this House. I am sure that the hard work of fellow Peers and collaboration with the Minister will ensure that it leaves this House in a clearer, more comprehensive and safer state. The well-being of future generations of internet users in this country depends on us getting it right.

My Lords, it is an enormous privilege to follow so many powerful speeches. My second daughter was born in the year Facebook launched in the UK and Apple sold its first iPhone. Today she is 15; she has lived her whole life in a digitally enabled world. She has undoubtedly benefited from the great things that digital technology brings, but, throughout that life, she has had no meaningful legal protection from its harms.

A number of noble Lords have referenced the extraordinarily moving and disturbing briefing that Ian Russell and his lawyer, Merry Varney, gave us on Monday. When I went home from that briefing, first, I hugged my two teenage girls really close, and then I talked to them about it. My 15 year-old daughter said, “Mum, of course, I know about Molly Russell and all the awful content there is on social media. Didn’t you realise? When are all you adults going to realise what’s going on and do something about it?” The Bill is important, because it is the beginning of us doing something about it.

It is also a huge Bill, so we need to be careful not to let perfect be the enemy of the good. Like other noble Lords, I urge this House to focus on the critical areas where we can improve this already much debated and discussed Bill and try to resist the temptation to attach so many baubles to it that it no longer delivers on its core purpose of protecting our children online. So, like others, I will focus my remarks on three structural changes that I hope will help make the Bill more effective at driving the positive changes that, I think, everyone in this House intends: first, the consequences for senior managers of not complying with the legislation; secondly, how compliance is defined and by whom; and, finally, which services are included.

To change digital platforms and services to protect children is not impossible—but it is hard, and it will not happen by itself. Tech business models are simply too driven by other things; development road maps are always too contested with revenue-raising projects, and competition for clicks is just too intense. So we need to ask ourselves whether the incentives in the Bill to drive compliance are strong enough to counter the very strong incentives not to.

It is clear that self-regulation will not work, and relying on corporate fines is also not enough. We have learned in other safety-critical industries and sectors that have needed dramatic culture change, such as financial services, that fines alone do not drive change. However, once you name an individual as responsible for something, with serious consequences if they fail, change happens. I look forward to the government amendment that I hope will clearly set out the consequences for named senior managers who do not deliver on their overall online safety responsibilities.

The second area I highlight is how compliance is defined. Specifically, the powers that the Bill grants the Secretary of State to amend Ofcom’s proposed code of conduct are far too wide. Just as with senior tech managers, the political incentives not to focus on safety are too strong. Almost every Minister I have ever met is keen to support tech sector growth. Giving the Secretary of State the ability to change codes of conduct for economic reasons is asking them to trade off economic growth against children’s safety—the same trade-off that tech companies have failed to make over the last 15 years. That is not right, it is not fair on the Ministers themselves, and it will not deliver the child protections we are looking for.

The third area I will cover—I will be very brief—has been highlighted by the noble Baroness, Lady Kidron. It is important that we capture all the services that are accessed by children. If not, we risk creating a dangerous false sense of security. Specifically, I am worried about why app stores are not covered. In the physical world—I say this as an erstwhile retailer—retailers have long come to terms with the responsibilities they bear for ensuring that they do not sell age-restricted products to children. Why are we shying away from the same thing in the digital world?

There are many other things I would support, not least the amendments proposed by the noble Baroness, Lady Kidron. I finish by simply saying that the most important thing is that the Bill is here. We need to do this work—our children and grandchildren have waited far too long.

My Lords, this is indeed a huge, complex and courageous Bill which deserves widespread support. Despite some welcome government amendments during its passage in the other place, there are residual concerns about guarantees of freedom of expression and access to information, as well as the degree to which the regulator, Ofcom, is independent of government control.

It is widely acknowledged by the Government themselves and the majority of those who have spoken to the Bill that the right to free speech is a fundamental aspect of our democracy, and that any restriction must be fully justified in the public interest. Public interest includes the freedom to access unwelcome, unpopular and even offensive material, if only to be able to refute it. It is also accepted that a functioning democracy needs new ideas and robust debate. That said, it is a fine and difficult line to draw between offensive material and illegal content. In their efforts, the Government have sought to protect above all the safety of children.

I start with a presumption in favour of free speech and a multiplicity of voices. Clauses 18 and 28 state that providers must

“have particular regard to the … users’ right to freedom of expression”

and to protecting users from breaches of any laws relating to privacy. This would be achieved by rigorous impact assessments of safety measures and policies, any infringements of which must be made publicly available. However, the definition of democratically important material as information

“specifically intended to contribute to democratic political debate in the United Kingdom”

remains vague, and other strict requirements on protecting children in the Bill could condemn offensive but necessary democratic content.

Clause 160 refers to false information intended

“to cause non-trivial psychological or physical harm”.

It may, in many cases, be entirely obvious when such harm is intended, but not in all cases. On whom does the burden of proof lie and what recourse does an individual have to appeal false accusations?

The stricture that democratically important content be preserved is by no means fully guaranteed by the following powers set out in the Bill. There is a potential danger of undue restriction that lies in the degree of control from the Secretary of State and his or her relationship with Ofcom; the terms and conditions of service for category 1 providers; the options, or lack of them, for user control of online material; and the role of Parliament.

Draft codes of practice are to be submitted to the Secretary of State, who could require Ofcom to modify codes in the interests of national security or public safety. The Secretary of State will pass any statement on strategic priorities to Ofcom, but parliamentary approval would be by means only of the negative resolution procedure.

The Secretary of State can issue guidance and directions to Ofcom, which in turn has a crucial role in acting against a provider that is not complying with the requirement to fulfil duties under the Act, including the imposition of fines of up to £18 million and “business disruption measures”—in other words, outright censorship. Although such drastic action could occur only in the case of a breach of the terms of service, there would be no restriction on taking down content to comply with other duties—for example, if it was judged that the content might be “likely” to be accessed by children. This, it is feared, would encourage providers to play safe. Furthermore, the terms and conditions can be altered at will by the provider.

The age verification process would necessarily require the user to register with a provider, preventing any casual access by adults. Furthermore, to remove unnecessary barriers to information, the controls available to the user should be a genuine option and not imposed by default.

This is a truly important Bill and I congratulate the authors and campaigners, as well as the Government, on bringing it to this advanced stage. I nevertheless believe that it could be further improved to ensure that the most liberal interpretations of online freedom of expression remain at the heart of our democracy.

My Lords, the Secretary of State, Michelle Donelan, has acknowledged that protecting children is the very reason that this Bill exists. If only the Government had confined themselves to that crucial task. Instead, I worry that the Bill has ballooned and could still be a major threat to the free expression of adults. I agreed with much of what the noble Baroness, Lady D’Souza, just spoke about.

Like some other noble Lords here, I am delighted that the Government have dropped the censorious “legal but harmful” clauses. It was disappointing to hear Labour MPs in the other place keen to see them restored. In this place, I have admired opposition resistance to assaults on civil liberties in, for example, the Public Order Bill. Perhaps I can appeal for consistency to be just as zealous on free speech as a foundational civil liberty. I urge those pushing versions of censoring “legal but harmful” for adults to think again.

The Government’s counter to many freedom of expression concerns is that free speech is protected in various clauses, but stating that service providers must have regard to the importance of protecting users’ rights of freedom of speech is incredibly weak and woolly, giving it a second-class status when contrasted with the operational safety duties that compel companies to remove material. Instead, we need a single comprehensive and robust statutory duty in favour of freedom of expression that requires providers to ensure that free speech is not infringed on by measures taken to comply with other duties. Also, free speech should be listed as a relevant duty for which Ofcom has to develop a code of practice.

The Bill requires providers to include safety provisions for content in their terms of service. However, no similar requirement for free speech exists. It seems ironic that a Bill that claims to be clipping the power of big tech could actually empower companies to police and censor legal material in the name of safety, via the commercial route of terms and conditions.

The Government brush off worries that big tech is being encouraged to limit what UK citizens say or read online by glibly asserting that these are private companies and that they must be free to develop their own terms of service. Surely that is disingenuous. The whole purpose of the legislation is to interfere in private companies, compelling them to adhere to duties or face huge penalties. If the Government do not trust big tech with users’ safety, why do they trust them with UK citizens’ free speech rights? Similarly, consider the user empowerment duties. If users ask that certain specified types of legal content are blocked or filtered out, such as hate or abuse, it is big tech that has the power to decide what is categorised under those headings.

Only last year, amendments put forward in this House on placing convicted sex-offending trans prisoners on the female estate were labelled online as hate-fuelled, transphobic abuse. However, with the ability to hear all sides of the debate online, and especially in the light of recent events in Scotland around the Gender Recognition Act, more and more people realise that such views are not hate but driven by concerns about safeguarding women’s rights. Would such a debate be filtered out online by overcautious labelling by big tech and the safety duties in its Ts and Cs?

Finally, like others, I am worried that the Secretary of State is given too much power—for example, to shape Ofcom’s codes of practice, which is a potential route for political interference. My concerns are fuelled by recent revelations. In the US, Elon Musk’s leaked Twitter files prove that, in the run-up to the 2020 election, Joe Biden’s presidential campaign routinely flagged up tweets and accounts that it wanted removed, influencing the suppression of the New York Post’s Hunter Biden laptop exposé. Here in the UK, only this week, a shocking Big Brother Watch report reveals that military operatives reported on online dissenting views on official Covid lockdown policies to No. 10 and the DCMS’s counter-disinformation unit, allowing Whitehall’s hotlines to giant media companies to suppress this legal content. Even the phrase “illegal” in the Bill can be politically weaponised, such as with the proposal to censor content allegedly promoting small boat crossings.

Free speech matters to democracy, and huge swathes of this Bill could threaten both unless we amend it appropriately.

My Lords, I begin by thanking the House of Lords Library and various organisations for their briefings on the Bill. One of the ways I want to approach this discussion is to talk about where I think there is consensus and where there will need to be further debate. Of course, as many noble Lords have said, there will be incredible trade-offs, and there are many issues people feel strongly about.

There is consensus on the issue of protecting children, and I pay tribute to the noble Baroness, Lady Kidron, for her work over many years on this, as well as that of other noble Lords. There is consensus on making sure that, where companies have terms and conditions, they actually enforce them. We have to be aware of that. There is obviously consensus on tackling sites promoting suicide and other self-harm measures.

Where I have concerns is around freedom of expression. Quite often, everyone says that they are in favour of freedom of expression until they are offended, and then they find a reason not to be. There are also concerns about the Secretary of State’s power to intervene and influence the online safety regime. I agree with other noble Lords that Ofcom should remain independent from the Secretary of State, but I am aware of public choice theory; institutions could be captured by political bias, so we have to be careful about that.

Noble Lords will submit amendments to bring back into the Bill the issue of harm to adults, but I would add a note of caution: how subjective is “harm”? A quick example is how Muslims reacted to the Danish cartoons. Some would have found them distasteful; some would have said they were harmed by them. Does that mean they should have been banned or taken down? How do we face these challenges in a free society? Can we be as technologically neutral as possible? Can we be careful of rent-seeking by organisations that will peddle their products and claim that they have the best age-assurance technology or something like that? Although we want the solution, let us make sure there is a thriving market to ensure that we get the better solutions. Regulation always lags developing technology; we will want this Bill to be as dynamic as possible, but that may require some secondary legislation, which I know many noble Lords are often sceptical about.

I really want to focus on unintended consequences, not because I am against the Bill but to warn of the difficult issues we are going to have to look at. First, companies will be acting as police but may take an overcautious approach. In the other place, and here, people talked about criminal liability with some of the directives, but think about the impact of criminal liability on other legislation—for example, financial companies when it comes to politically exposed persons. We all know the unintended consequences of that from being overcautious.

Adult verification is another issue. Whatever we think about pornography, it is legal. What people will be concerned about is whether they can verify their age in an anonymous way. They will be concerned whether their data will be used later to blackmail them; will verification drive users to the dark web? Not everything on the dark web is illegal. Some authoritarian regimes such as Russia, China, Saudi, Iran and Venezuela have tried to ban the Tor Browser, but are we going to follow them? There are also ways around it. One way that terrorists have been known to share information was to create an email account, share the password and username, and leave messages for each other in the drafts folder. How do we tackle that without impacting on all users of the internet? How do we also make sure that firms enforce their terms and conditions and, in doing so, do not water them down?

I know that there are many questions, but I hope that we will work through them, and others that have been raised, so that we have a Bill that is proportionate, workable and effective, and that protects children, women and girls, and vulnerable adults.

My Lords, I generally welcome the Bill and I pay tribute to the noble Baroness, Lady Kidron, for the great work she has done. In the Bill, I particularly welcome the movement towards greater protection for children than we have had hitherto. I share the concern of the noble Baroness, Lady Benjamin, that there may be difficulties, including the age-verification system, which was raised by the noble Lord, Lord Kamall. I am in favour of age verification and I would like to see it implemented quickly. I would also like the Minister to assure us that, having waited so long, if we find that there are loopholes in it, we can find some mechanism to fill those loopholes fairly quickly—perhaps a commitment to using secondary legislation rather than having to wait for so long, as we have done in the past.

My second concern relates to Clause 12, which the right reverend Prelate the Bishop of Oxford raised and which the noble Baronesses, Lady Hollins and Lady Finlay, also spoke to, on the protection of adults from risk and harm. I do not think enough attention has been paid to what is happening with pornography and with mental health. Here I declare an interest as the founder and vice-chair of an All-Party Group for the Twelve Steps Recovery Programme from Addiction. Addiction is not just about alcohol. AA started the 12-step programme but it has been extended over the years to a whole range of other addictions—not least drugs, gambling and overeating—and in particular it is growing quite extensively in the sexual field. We have a range of 12-step programmes operating, including for SLA—sex and love addiction—and sexual addiction. As to the latter, an ever-increasing number of people are in grave trouble due to the effects of pornography, not solely on themselves but also on the rest of their family in a whole range of different ways.

It is quite interesting that, among the people watching pornography—mainly men—the hours between midnight and 4 am are when most porn sites are being visited. These sites are affecting people mentally, affecting their work and affecting their relationships. The Bill as it stands does not address that issue sufficiently well. They had a go at it in the Commons and were persuaded that the approach was incorrect. Pornography is growing. We must protect the freedom of speech and what we circulate, but equally we must protect standards. In turn, we must make sure that we are not creating, in certain areas, a decadence that we have not had before and that is damaging to society.

I hope that we might look again at Clause 12 and try to find a way for some accommodation to be found between the Government’s viewpoint and the views being expressed by people such as the noble Baroness, Lady Finlay. It is important that we do so; if not, we will have to start campaigning privately. If we cannot get it through law, we will have to bring together those concerned about pornography and look for ways to bring to the attention of people that it must be drawn to a halt or at least diminished, given the extent and pace at which it is growing at present. I think it can be done. We have a dry January; why should we not, in the month of December, encourage people not to engage in pornography? At least it would capture attention. If we want to have a better society, we should be diminishing this practice rather than growing it.

My Lords, it is an honour to follow the intriguing suggestion of the noble Lord, Lord Brooke, about December—which I will not repeat at this moment. I declare my interest as a former head of public affairs at the BBC who heavily lobbied this House in 1995 and 1996 to bring about the Broadcasting Act which set BBC online on its way. I am proud to say that BBC online remains a beacon of responsible content to show the rest of the world. I am also co-chair of the all-party group on media literacy and patron of Student View, which works in over 100 schools around the country to deliver media literacy.

In the original draft Bill, media literacy was not a central point but an important point of commitment. It has since been removed from the final legislation in front of us. As the Minister said in his introduction, there are multiple provisions in the legislation which cater for enabling adults to make sensible use of their media journey. However, there is very little, other than protections for children, to enable children to develop an intelligent understanding of their media journey.

According to the National Literacy Trust, in its assessment a few years ago, only 2% of children had the critical thinking skills necessary to be able to distinguish between fact and fiction online, and 90% of teachers say they are in favour of media literacy but feel that they do not have the skills to be able to teach it. They also feel that the vast majority of children they teach who discuss media issues consistently in the classroom do not understand the difference between truth and misinformation.

I want to keep it simple and say two things to the Minister and one to the Opposition. First, to the Minister, given the level of fines which should become apparent as a response to abuse of this legislation, money will be available to empower media literacy programmes inside and outside of schools. There should be no excuse that there is no money; the money in fines should go not just towards Ofcom’s costs but towards improving the capability of the next generation to navigate the media landscape. Will the Minister and the Government consider that?

It is obvious that media literacy is not in this Bill now because the Government argued it was essentially an education matter. In that case, will the Minister commit the Government—as he speaks for the Government —to bringing forward a media literacy education Bill before the next election? If it is not possible and there is to be a Labour Government after the next election, will the Labour Front Bench commit to bringing forward a media literacy education Bill, rather than simply letting this issue drift into the long grass? The noble Lord, Lord Stevenson, can answer that directly at the end and make a commitment on behalf of the Labour Front Bench we can all hold him to account on.

There also needs to be substantial support for teaching teachers to understand and navigate a forest that they do not necessarily know how to enter or exit. That should be part of teacher development and support. Can we also consider the costs of misinformation and how it is damaging our social fabric? Can the Minister request of the Treasury that it brings forward cost assessments of the damage of misinformation?

My Lords, the internet is in so many ways a wonderful new continent, discovered only in my adult lifetime. But like older territories it has not been the unadulterated bastion of freedom and equality that its pilgrim and founding mothers and fathers would have dreamt of. While it has created enormous opportunities for expression, interconnection and learning, it has also allowed the monetising of hate and abuses of power up to and including serious criminal offences to the detriment of children and other vulnerable people.

To a large extent, big tech corporations with monopolistic power have become the new imperium, colonising this new continent without the desire, expertise, independence or accountability to properly regulate or police it. Further, as the technology has moved at a breath-taking pace, national Parliaments and Governments have lagged behind in even fulfilling their basic duties to resource the enforcement of existing criminal law online or, indeed, to ensure sufficient tax raising from the new emperors who can employ former senior politicians for their lobbying, influence national elections via their products and seek to further their hegemony even beyond our shrinking, burning planet.

Alongside corporate and governmental neglect, there have been abuses of people’s rights and freedoms by state and non-state entities around the world. It is very possible to be too permissive in allowing private abuse and simultaneously too interventionist so as to abuse political power. Noble Lords would be wise to hold on to that duality as they undertake the most anxious line-by-line scrutiny of this Bill. With that in mind, given the length, novelty and complexity of this draft legislation, I regret the short time allocated today. The sheer number of speakers should have justified two days of Second Reading, if only to prevent de facto Second Reading speeches in Committee.

Legislation is required and the perfect should not be the enemy of a first attempt at the possible. However, given the fast developing and global landscape, further legislation will no doubt follow. Ultimately, I believe that His Majesty’s Government should seek to pioneer a global internet and AI treaty in due course—or at least, a Labour Government should. For one thing, the black boxes of advanced algorithms must be made transparent and subject to legal control so as not to entrench inequality, discrimination and hate.

That may sound ambitious, but it will take that kind of ambition—the kind of ambition that we saw in the post-war era to establish some notion of an international rule of law and fundamental rights and freedoms in the real world—truly to establish a proper rule of law with protected human rights in the virtual one. At the very least, what is already criminal should be policed online. However, we should be wary of outsourcing too much of that policing role to corporations without at least binding them more directly to the free expression and personal privacy protection duties that bind Ofcom, police and prosecutors under the Convention on Human Rights.

Furthermore, we should look again at tightening up over-broad public order offences, such as causing alarm or distress under Section 5 of the Public Order Act, before allowing them to constitute priority illegal content for proactive removal. Conversely, will the Minister confirm that, for example, euphemistic sex for rent adverts targeting poor, vulnerable women, in particular, will be a priority under Section 52 of the Sexual Offences Act? As this experiment in national regulation of an international phenomenon develops, the powers of the Executive to direct Ofcom set a dangerous politicising precedent for regimes elsewhere. They should be removed.

My Lords, I am pleased to add my name to the Second Reading of such an important but complex Bill. There is very little time to speak on such positive and necessary legislation—200-plus clauses and 17 Schedules. But I know from experience of this Chamber that we will scrutinise every full stop to make it far better than when we received it.

While we must recognise that companies should have safeguarding policies and penalties in place, we should also never forget the lives of our young children, those who have been taken and the voices of bereaved families. They should be in the veins of this Bill right through to the end.

I say this as I remember that, in the trial following my husband Gary’s murder 15 years ago, some of the evidence shown was horrific violence downloaded on the offenders’ phones. The content was so horrific that the judge laid it on file for whenever they had parole hearings. It showed injuries identical to those Gary received—kicking and punching injuries that those on trial thought were very funny, even when they watched it in the courtroom from the dock. I now have three daughters who suffer from post-traumatic stress disorder. I have to ensure that they never forget their father, and do not just remember him lying on the ground that August evening.

In my role as Victims’ Commissioner, for seven years I had the pleasure and honour of listening to many victims and survivors of horrific crimes. Time is short but I would like to mention the mother of Breck. Her son was beautiful, bright and bubbly, only to become removed from any emotion and from his family. Breck was groomed online by an 18-year-old man who ran the internet gaming server that Breck and his schoolfriends used every day. Our children are most likely using Xbox consoles and have contact with these people from their own homes. The groomer used lies, manipulation and false promises to gain Breck’s trust. Despite many attempts by the family to stop Breck’s communication with his groomer, he ignored the safety advice he had been given by his family and was sadly lured to the groomer’s flat. On 17 February 2014, Breck was brutally murdered by this online groomer. So, the noble Baroness, Lady Kidron, and all those bereaved families who have worked tirelessly to make sure that the Bill has teeth and power to protect their loved ones, have my full support.

I thank Barnardo’s, the NSPCC, Refuge and the Centre for Women’s Justice for their briefing. My interest will be in the work and roles of the Victims’ Commissioner and the Domestic Abuse Commissioner, and the code of practice to protect the VAWG sector in light of women being 27 times more likely to be abused and harassed. I will be supporting my noble friend Lord Bethell’s amendment on age verification, regarding pornographic content that children can access. While this Bill is for the professionals and absolutely about penalising the guilty, we must never forget the families who have to live, every day, through the hardship and heartbreak of losing a loved one. We must ensure that there is a channel to protect those families and support them to have a better life in memory of their loved ones.

My Lords, I welcome the Bill but it is very long overdue. The Second Reading of my Private Member’s Bill was on 28 January 2022. It sought to commence Part 3 of the Digital Economy Act 2017. This would have ensured that age verification of pornography was applied to pornographic websites. It is disappointing that the Bill has not progressed and Part 3 of the Digital Economy Act—a vital tool that could have prevented children accessing online pornography —is still not being implemented.

I remind your Lordships that in February 2016—now seven years ago—the Government said:

“Pornography has never been more easily accessible online, and material that would previously have been considered extreme has become part of mainstream online pornography. When young people access this material it risks normalising behaviour that might be harmful to their future emotional and psychological development.”

Nothing has changed in seven years; the threat is still as real today as it was then. All that has changed is that, during that seven-year delay, more children’s lives have been harmed. This cannot be allowed to continue.

I welcome that the Government have listened to the concerns about access to commercial pornographic websites and have, as a result, introduced Part 5 of the Bill. However, I believe more changes are needed to make it effective. Today, I raise only three of them. First, the Bill needs a more robust definition of pornography, based on the 2017 Act. Secondly, the Bill needs to cover all pornography services. Clause 71 says that only if “a service has links” with the UK will it be required to comply with the duties in Part 5, where “links with” means only pornographic websites which have a significant number of UK users or have the UK as a target market.

I ask the Minister: what will be considered significant? Is it significant in terms of the total UK adult users who could use the service, or significant in terms of potential global users? Either way, it seems to me that there could be pornographic websites accessed in the UK that are not required to have age verification to protect those aged under 18 from accessing this content. I doubt that this is what parents expect from this flagship Bill.

Finally, the Bill needs a commencement clause for age verification. Far too many young people have grown up without the protection that age verification could have brought in, if the 2017 Act had been implemented. We have heard others refer to this. There should be no further delay, and the Government should demonstrate the urgency that they spoke of when they announced in October 2019 that they would not be implementing the 2017 Act. Age verification needs to be implemented as quickly as possible, and that is why a commencement date clause is needed in the Bill.

We cannot countenance these measures not being brought into force, or even a long delay of three or more years. The children’s charity Barnardo’s, which has already been referred to, has estimated that children have accessed pornographic content almost 55 million times since the Government announced in 2019 that they would be bringing forward the Online Safety Bill as an alternative to Part 3 of the Digital Economy Act. This cannot be allowed to continue. That is why we need to get the Bill right and ensure that robust age verification, that applies to all websites and social media accessed in the UK, is brought in as quickly as possible. I look forward to exploring these issues further in Committee.

My Lords, I begin with a brief reflection on my 26 years or thereabouts on the internet, which saw me hand-coding my first website in 1999 and sees me now, I believe, as one of the few Members of your Lordships’ House with a TikTok account. I have had a lot of good times on the internet; I have learned a lot, made a lot of friends and built political communities that stretch around the world in ways that were entirely impossible before it arrived. That tells you, perhaps, that I think we should be careful in this debate about the diagnosis of the source of undoubted issues that the Bill seeks to address. It appears that some would like to wave a magic wand and shut it all down if they could—to return to some imagined golden age of the past, perhaps when your Lordships’ House was harrumphing loudly about the damaging effects of this new-fangled television.

While we are talking about young people, I have serious questions about the capacity of this House to engage with this debate. Yes, we did well in getting online during lockdown, even if we sometimes caught a glimpse of the grandchildren or great-grandchildren pressing the buttons so that their elders could speak in the House. They are the same generation over whom we are now looking to take control. I invite noble Lords to keep that in mind as this debate proceeds.

I put it very seriously to your Lordships’ House that, before we proceed further, we should invite a youth parliament into this very Chamber. We should listen to that debate on this Bill very carefully. On few subjects is the need for votes at 16, or even younger, more obvious—the need for the experts by experience to be heard. They have the capacity to be the agents and to shape their own world, if their elders get out of the road.

I have no doubt that those young people would tell us that they suffer harm on the internet, with awful violent pornography and dangerous encouragements to self-harm and suicide. There need to be protections, while acknowledging that young people cannot be walled off into a little garden of their own. But I am sure young people would also say we need to address much wider issues, to build resilience and provide an education that encourages critical thinking rather than polished regurgitation of the facts. I would associate myself with the remarks of the noble Baroness, Lady Merron, and, indeed, the noble Lord, Lord Hastings of Scarisbrick, among others, about the need for media education. But how do we encourage critical thinking about the media when we are also encouraging regurgitation of the right results for the exam—that you have to repeat these 10 points? The two things do not fit together.

In a stairwell discussion with a Member of your Lordships’ House who is not a digital native—and I point out that nobody in this debate is a digital native—but is certainly someone with much experience over decades, they reflected on the early hopes of the internet for democracy, for access to information and for community. They suggested it was inevitably a lost age; I do not agree. Political decisions and choices allowed a handful of multinational companies—mostly tax-dodging, accountable to nobody but their shareholders, and now immensely rich—to dominate. That is not unique to the internet; that is what the political decisions of neoliberalism over the past decades have done to our food supplies, our retailing systems, our energy, our medicines and, increasingly, our education system. Far right, misogynistic, racist, homophobic and transphobic voices have been allowed to take hold and operate without challenge in our mainstream media, our communities, our politics and on the internet.

Financial fraud is a huge problem on the internet and, hopefully, this Bill might address it; but financial fraud and corruption is a huge problem across our financial sector, as indeed is the all-pervading one of gambling. The internet is a mirror to our society, as well as a theatre of interaction. The idea that we can fix our societies by fixing the internet is a fallacy; for many with commercial and political interests, it is a comfortable one that deflects political challenges they would rather not face.

My Lords, if a child goes to the Windmill club, the most famous strip club in Soho, the bouncers will rightly turn them away, no ifs, no buts: no entry, full stop. If a child tries to buy a knife on Amazon or to place a bet online, it will be the same story: you need proof of age. But every day, millions of children in this country watch pornography in their homes, at schools, on the bus, on devices of all kinds, without any hindrance at all. The Children’s Commissioner makes it really clear that this is not just raunchy pornography like in the old days of Razzle magazine. These are depictions of degradation, sexual coercion, aggression and exploitation, disproportionately targeted at teenage girls. As Dame Rachel de Souza said:

“Most of it is just plain abuse”.

The effects of this failed experiment are absolutely disastrous. The British Board of Film Classification says that half of 11 year-olds have seen porn, and according to the NSPCC, a third of child abuse offences are now committed by children. The answer is straightforward in principle: we need to apply the rules on age verification for porn that exist in the real world to the online world. We need to address this harm immediately, before any more damage is done—before there is any metaverse or any more technology to spread it further.

I know that the Minister, the Secretary of State and the Prime Minister all broadly agree with this sentiment, and that is why the Bill has:

“A duty to ensure that children are not normally able to encounter content that is regulated provider pornographic content in relation to the service (for example, by using age verification).”

But this vague power simply starts a long process of negotiation with the porn industry and with tech. At a very minimum, it will require a child protection consultation, a children’s access assessment, a guidance statement, an agreement on child protection guidance and codes, secondary legislation, parliamentary approval of the Ofcom child protection code, monitoring and engagement, engagement on the enforcement regime, test cases in the courts—and so on.

I appreciate that we are creating laws flexible enough to cope with technological evolution and I totally support that principle, but we should not reinvent the wheel. We tried that 30 years ago when the online porn industry started, and it failed. We need one regime for the real world and for the online world. This is an opportunity to send a message to the tech industries and to the British people that we mean business about protecting children, and to put Britain at the vanguard of child protection regulation.

I want to see this Bill on the statute book, and I am very grateful for engagement with the Minister, the Bill team and all those supporting the Bill. I look forward to suggestions on how we can close this gap. But if we cannot, I will table amendments that replace Part 5 of the Online Safety Bill with Part 3 of the Digital Economy Act—a measure that has considerable support in another place.

My Lords, I draw attention to my interests as in the register, and I thank all those who have sent briefing notes. I do not think any of us underestimates the scale of what we have to achieve in the coming weeks.

Just this morning, I read an article in which Dame Rachel de Souza was quoted as saying that this Bill is an “urgent priority”. The article described a 12-year-old girl being strangled by her boyfriend during her first kiss:

“He had seen it in pornography and thought it normal.”

This afternoon, many figures have been quoted on children’s access to pornography, and each figure is deeply disturbing. I listened very carefully to the words of the noble Lord, Lord Bethell; he made a compelling argument, and I will strongly support any amendments he brings forward.

Along with age verification we need better education for children on the use of the internet, and on appropriate relationships. We have to be very aware of content that pushes weight loss, body image and appearance, of appearance-improving ads, and of images that have been altered.

I would like to concentrate on violence against women and girls, and I thank all the women who have been in touch with me. We must recognise the threat that women are under. Women are 27 times more likely to experience abuse—that is one in three women. Some 62% of young women have experienced abuse. Four out of five cases of online grooming involve girls, and 120 cases are being reported every week. To bring that closer to home, 93% of female MPs have experienced online abuse just for doing their job or having an opinion. I am not trying to stifle free speech. Yes, we have to accept criticism and challenge, but not abuse and threats. I really worry about us developing a social norm of trying to shut down women’s voices. I am mindful that we in this Chamber and in another place have a high degree of protection that women in the outside world do not. We live in a world where a rape threat against a woman can potentially remain online, but a woman talking about menstruation can be told that it breaches guidelines. The balance is not yet right.

I offer my support to my noble friends Lady Hollins and Lady Finlay regarding vulnerability; it does not end at the age of 18. We have to think about those who are vulnerable. The empowerment tools do not go far enough, and we need to explore that in more detail in Committee.

Finally, I pay tribute to my noble friend Lady Kidron. I thank her for her work and for arranging a meeting with the Russell family, and I thank Ian Russell for being here today. That meeting fundamentally strengthened my view on what we need to do. It was shocking to hear what various platforms deemed to be acceptable. I naively expected them to be better. It completely ignores those who are in a vulnerable position, who can be constantly bombarded with abusive images. I have spent the last couple of days trying to put into words my feelings on listening to what Molly went through. It is horrendous, and while we applaud the resilience and bravery of the Russell family, this is our chance to do so much more and to protect internet users.

My Lords, I want to talk about the link between online financial scams and mental health. People who have problems with their mental health are, for a variety of reasons, more vulnerable to such scams. They are three times more likely to be the victims of online financial scams than those people without such problems and, conversely, people who are victims of online scams are much more at risk of developing mental health problems.

I understand and have been impressed by the contributions to this debate about the problems faced by children and women, but I think, given the opportunity of the Bill, it is important that this issue is addressed. The results of such scams lead to much misery. They destroy families and, in all too many cases, lives. So the question is: can, and how should, the Bill address this problem? This is the Bill on the stocks and the one in which we must address this issue.

There is no doubt that scams are a big and growing problem. Anyone can fall victim to such a scam, but people with mental health problems are more at risk than others, so we have to do what we can, first, to improve scam prevention and, secondly, to ensure that when people fall victim they get the support that they need.

I have to pay tribute to the work being undertaken by the Money and Mental Health Policy Institute. It has drawn attention to how online harm can arise in a variety of areas: gambling, retail and financial offers. A number of recurring themes have emerged where action is needed, such as where people all too easily lose control of their transactions. There is also advertising and the way in which tools and techniques are developed that pressurise people into falling victim. The institute has concluded and demonstrated how, all too often, this behaviour goes unchecked, with regulation lacking or being poorly matched to what actually happens online.

While I understand the other issues that need to be addressed and that led to the Bill, the problem of online financial scams is sufficiently serious to deserve attention in it.

My Lords, it is beyond any doubt that an Online Safety Bill is needed. The internet has been left uncontrolled and unfettered for too long. While the Bill is indeed welcome, it is clear that more work needs to be done to ensure that it adequately protects children online.

There is a substantial body of evidence suggesting that exposure to pornography is harmful to children and young people. Many have spoken in this debate already about the harm of easy access to pornography, which is carried into adult life and has a damaging impact on young people’s views of sex and relationships. For many young men addiction to pornography, which starts in teenage years, can often lead to the belief that women should be dehumanised and objectified. Pornography is becoming a young person’s main reference point for sex and there is no conversation about important issues such as consent. That is why the Bill needs to have proper and robust age verification measures to ensure that children cannot access online pornography and are protected from the obvious harms.

Even if the Bill is enacted with robust age verification, experience tells us this is no guarantee that age verification will be implemented. Parliament passed Part 3 of the Digital Economy Act in 2017, yet the Government chose not to implement the will of this House. That cannot be allowed to be repeated. Not only must robust age verification be in the Bill, but a commencement date must be added to the Bill to ensure that what happened in the past cannot be allowed to happen again.

I know that some Members of the House are still fearful that age verification presents an insurmountable threat to privacy: that those who choose to view pornography will have to provide their ID documents to those sites and that their interests may be tracked and exposed or used for blackmail purposes. We live in an age where there is little that technology cannot deliver. Verifying your age without disclosing who you are is not a complex problem. Indeed, it has been central to the age verification industry since it first began to prepare for the Digital Economy Act, because neither consumers nor the sites they access would risk working with an age verification provider who could not provide strong reassurance and protection for privacy.

The age verification sector is built on privacy by design and data minimisation principles, which are at the heart of our data protection law. The solutions are created on what the industry calls a double-blind basis. By this, I mean that the adult websites can never know the identity of their users, and the age verification providers do not keep any records of which sites ask them to confirm the age of any particular user. To use the technical terms, it is an anonymised, tokenised solution.

The Government should place into the Bill provisions to ensure robust age verification is put in place, along with a clear time-limited commencement clause to ensure that, on this occasion, age verification is brought in and enforced. I support the Bill, but I trust that, as it makes its way through the House, provisions in it can be strengthened.

My Lords, I declare my interests as deputy chairman of the Telegraph Media Group and director of the Regulatory Funding Company, and I note my other interests in the register.

I welcome the Bill as the first rung on the ladder, ensuring that the unregulated, non-transparent and unaccountable platforms begin finally to be subject to the legal strictures of regulation, accountability and transparency. In 1931, Baldwin famously said the press exercised power without responsibility. Now, the press is subject to intense regulation and tough competition laws, and it is the platforms exercising power without responsibility. This vital Bill begins the journey to rectify that.

It was an honour to sit on the Joint Committee and a huge pleasure to work with colleagues from across the House under the exceptional chairmanship of Damian Collins. In particular, the noble Baroness, Lady Kidron, brought such insight and energy to our work. I believe that, as a result of its work, the Bill strikes an appropriate balance between platform regulation, freedom of expression and the protection of quality journalism.

I will make just two points about the policy backdrop to this legislation. While regulation is crucially important, it is just one side of the coin: it must go hand in hand with competition. What is vital is that legislation to deal with digital markets and consumer protection follows swiftly. It is time—to coin a phrase—to level up the playing field between platforms and publishers.

For years, news publishers have operated in a deeply dysfunctional digital market, hampering efforts to realise fair returns for their content. Local and regional publishers continue to be hardest hit. Platforms generate a huge portion of advertising revenue from news media content: figures calculated by Cambridge professor Matt Elliott estimate UK publishers generate £1 billion in UK revenues for Google, Facebook, Apple and others each year.

The news consumption trend from print to digital means digital markets must function in a fair and transparent way to secure the sustainability of quality journalism. Google has more than a 90% share of the £7.3 billion UK search advertising market. That means platforms take news content for free while, at the same time, taking the bulk of the advertising revenue which would have paid for it in the analogue world.

I welcome the fact that the Government will bring forward legislation to deal with this by giving the Digital Markets Unit statutory powers and tough competition tools. It will be a world-leading digital regulator alongside this world first in online safety, paving the way for a sea change in how platforms operate and ensuring the sustainability of journalism.

As a new age of regulation dawns, I join my noble friend Lady Stowell in urging the Minister to ensure speedy implementation of changes that are the vital other side of the coin. The Joint Committee said in its report that this should happen as soon as possible. Indeed, these two pieces of legislation will feed off each other. As a joint report by the CMA and Ofcom concluded:

“Competition interventions can … improve online safety outcomes.”

My other point is the fluid nature of the legal ecosystem surrounding the platforms, which the noble Baroness, Lady Chakrabarti, mentioned. For almost 30 years the US tech giants have benefited from the protection of Section 230 of the Communications Decency Act, enacted as part of the Telecommunications Act of 1996. Passed while the internet was in its infancy, it provided platforms with safe harbours in which to operate as intermediaries of content without fear of being liable for it, which is why we now have the manifold, terrible problems of social media we have heard about today, which the Bill is rightly addressing. But times have changed, and that backbone of internet law is under intense scrutiny, above all from the US Supreme Court, which has for the first time in a quarter of a century agreed to hear a case, Gonzalez v Google, challenging the immunity of companies that host user content online. The court’s decision will have a significant impact on the internet ecosystem, especially taken alongside anti-trust legal actions in the US and the EU. They are issues to which we will inevitably have to return.

The Bill—along with many other developments that will have a profound effect on competition, on regulation and on the protection of children—ushers in an era of radical change, but is, as we have heard a number of times today, only part of the journey. Let us now move forward swiftly to finish that job.

My Lords, the power to amplify, together with the volume and speed of the online world, has put power in the hands of individuals and organisations, for better or for worse. While we seek to control the worst, we also have to be aware that we now have the most extraordinary communication tool for ideas, gathering others to our cause and getting information around the world in a flash, as well as providing avenues for those in countries that do not have the miracle of free speech to contact the outside world, because their media, and they, are state-controlled.

Of course, what is illegal offline is illegal online. That is the easy bit, and where my preference undeniably lies. The new offences, dealing with what were some of the “legal but harmful” issues, cover off some of the most egregious of those issues.

In our last debate on freedom of expression, I said:

“I want maximum controls in my own home. Put power in my hands”.—[Official Report, 27/10/22; col. 1626.]

The user empowerment now in the Bill will target things such as suicide content, eating disorder content, abuse targeting race, religion, sex, sexual orientation, disability and gender reassignment, and the incitement of hatred against people with those characteristics. But I will argue, as others have, that a default setting must be in place so that such material is not available unless chosen. Thus the algorithmic onslaught of content that follows a single search can be averted. More importantly, vulnerable adults, who may not be capable of selection and exclusion, need that protection. We do not have to view what we do not want to see, but let that be our choice before we are fed it.

Equally, with the removal of the legal harms provisions, violence against women goes unaddressed. The onslaught of misogyny, bullying and worse directed at women is dangerous and totally unacceptable. A whole raft of organisations is behind the push to amend Clause 36 to require Ofcom to develop a VAWG—violence against women and girls—code of practice. I hope and trust that noble Lords across the House will support this.

I want the cyberflashing provision—sending pictures of genitals, which thankfully is now an offence in the Bill—to be amended so that the test is not whether the sender intended to cause harm, as the Bill now provides, but whether the sender had the recipient’s consent. Women are sick and tired of being made responsible for male misbehaviour. This time, let the responsibility be on the men.

On children, age verification is nowhere near strong enough in the Bill in its current form. I trust that this will change during the Bill’s passage. Like probably everyone in this House, I pay tribute to the noble Baroness, Lady Kidron, for all the work she does.

In our legislative endeavour, we must guard against authoritarian creep, where the prohibition against what is truly harmful oversteps itself into a world where we are to be protected from absolutely anything that we do not like or agree with—or, worse, that the Government do not like or agree with. As others have said, the powers of the Secretary of State in the Bill are Orwellian and need to be pushed back.

Free speech presents challenges—that is the point—but the best way to challenge ideas with which you disagree is to confront them by marshalling better ethics, reason and evidence. Life can be dangerous, and ideas can be challenging. While we must not submit our intellect and freedoms to the mob, we must protect the vulnerable from that mob. That is the dividing line we must achieve in the Bill.

My Lords, like other contributors to the debate, I support the Bill, but that does not mean that I think that it is perfect; we must beware of letting the best be the enemy of the good. I declare my interests as a trustee of Full Fact and the Public Interest News Foundation.

I was very glad that the noble Lord, Lord Hastings, referred to the Broadcasting Act 1996, because, during its passage as a Bill, I was fulfilling the role that the Minister is performing today. I remember that, before coming to address your Lordships, I looked at the draft speech that had been prepared and which described the Bill at length. It was incredibly boring, and I said, “No, I am not going to do that; I want to describe to the House what the world that the Bill will bring into effect will look like”. I told your Lordships that I was taking them into a world of science fiction. In fact, I may have misled your Lordships on that occasion, because I underestimated the impact of the technology that was evolving. Also, I do not think that anybody realised quite to the extent that we do now that you cannot disinvent technology: things have happened which are here for ever.

While technology has changed, sadly one thing has not changed: human wickedness. Rather, human wickedness has been innovative. The Government tell us that they are great believers in innovation, but I do not think that they believe in innovation in this context. History suggests, and the contemporary world corroborates, that countering wickedness and vice is never easy, particularly when it is complicated by issues of jurisdiction, geography and technology.

My view is that this simply cannot be done by primary law or, indeed, secondary legislation. As the noble Baroness, Lady Stowell, touched on, we need all kinds of soft law and codes of conduct to complement that. She was right that we have to move on from the kind of legislative approach we have now, which I call “stop and start”. We have a period of intense debate in Parliament about a piece of legislation and then, as has been heard this evening, it is all forgotten for five years—and then you find that the piece of legislation you passed does not really meet the problems of the day. We must find a way of passing what I like to describe as “living legislation”, so that it is possible, in an ongoing way, to allow those things to evolve in response to the problems that the world is presenting. It is not simply a matter of a cosy relationship between the Government, the regulator, media companies, pressure groups, charities and so on; Parliament must be involved in doing what is, after all, its real job: law-making. I think that the public, too, need to know what is going on.

If I am right in saying so, and I think I am, this kind of static approach to law-making cannot really be what is needed in circumstances of the kinds we are talking about now. Parliament, this House and the other place together, should somehow take the metaphorical bull by the horns and evolve ongoing procedures to complement the technological evolution of the internet, which changes every day—indeed, things will have changed during the duration of the very debate we are having. I dare say that the same is true elsewhere, including in other sectors about which I know very little. If we, as parliamentarians, do not grasp this particular nettle, the consequence will be that the citizens of this country will materially lose control over quite a lot of what surrounds their daily lives.

My Lords, I am sure I have been annoying my noble friend the Minister for the last year by asking him when the Bill is coming. Today is one of those days when everything happens at once: two of my daughters are out of school because of industrial action, and I spent most of this morning arguing with them about whether they could go on the internet and how long they could spend on it. They said it was for homework, and I said that it was not and that they should read a book; you can imagine it.

There is a point to that slightly grumpy anecdote. First, I take issue with the suggestion by the noble Baroness, Lady Bennett, that your Lordships’ House does not engage with the next generation. More to the point, there is a fundamental tension that millions of parents up and down the country face. Our children are online a lot and sometimes we want them to be online. Do not underestimate the way that lockdown accelerated their online lives through home-schooling, necessarily—I declare my interest as a non-executive at Ofsted. Sometimes this was to their advantage, but I suspect on the whole it was probably not.

My concern is that while children should be able to get on and do their homework, we have allowed big tech to mark its own homework. The really appalling evidence that we have heard today underlines the urgency to get this Bill right.

The noble Lord, Lord Knight of Weymouth, hit the nail on the head—he usually does—about the speed and complexity of the technology; it is just so fast. Most parents that I know certainly do their best to keep their children safe. It is a bit like Sisyphus rolling the boulder up the hill; it just comes back down, because it is so much easier now for our children to be deceived, abused and bullied and to view the stuff of nightmares. When this includes pornography sites, which many others have talked about, with characters from children’s TV such as “Frozen” and “Scooby-Doo”, I do not think it is particularly dramatic to wonder what we have become as a society to allow this sort of thing to happen. I welcome the consensus that we have heard around the need to protect our children, although it tragically is too late for many. I am sorry that the process has dragged.

I will work across the House at Committee stage and beyond to make sure that the Bill is sufficiently stringent, that the scope is correct and that it is workable, because we cannot risk giving parents and young people false reassurance or weak new systems. The noble Baroness, Lady Harding, was very clear on this and I share her concerns about app stores not being in scope.

Going back to pornography, I know my noble friend the Minister takes these things extremely seriously, but I do not see how anybody can feel reassured unless the Government commit to robust age verification, as set out by my noble friend Lord Bethell.

In the time I have left, I want to address cyber flashing. I am very glad the noble Baroness, Lady Featherstone, did so too. I completely agree that it should be based on consent. I felt weary having to have these sorts of conversations again: about victims having to somehow prove that they are not overreacting, or being told that if it was a bit of a laugh then it does not really matter. It makes no difference to their experience. I do not want to be presumptuous, but I think there is cross-party impetus to ensure that the new offence is based on a principle of non-consent, and I hope the Government will be prepared to listen. This is no criticism of my noble friend the Minister, who is an excellent Minister with an excellent team at DCMS, but it seems to me that these issues have been left in the “too difficult” pile for far too long and we must not miss our chance now that it is here.

My Lords, I do not think anyone in this House would disagree with the idea that freedom of expression is a very precious freedom. We have only to look around the world to see that authoritarian Governments almost invariably go after free speech as one of the first things that they do. We know that media freedom is a vital part of any democracy, as indeed is the rule of law, but as the noble Lord, Lord Black, and the noble Baroness, Lady Chakrabarti, said, law has been pretty absent in this whole arena, even where it could have been used. I am glad that we are now addressing the complicated issue of regulating the internet and these platforms.

I do not want to see journalists’ privacy invaded so that their sources are exposed. I do not want any possible chilling effect on investigative journalism exposing corruption and abuse of power. It is vital to our democracy. However, we have to think very seriously about the kind of regulation that we have been discussing in this House, because it has been part of our tradition. Unlike the United States, we have not fetishised freedom of expression. We have seen that there have to be occasions when we restrict freedom of speech to protect people from serious harm. That is what this discussion today is really about and will be in the course of the Bill.

I declare that I am a trustee of 5Rights, which is the foundation created by the redoubtable noble Baroness, Lady Kidron. As a lawyer who is pretty well versed in the need for law, I have learned so much from her, and I believe that the major priority of this Bill has to be the protection of children. There are still gaps, and when the noble Baroness comes to bring her amendments forward, I will be there speaking in support of them. I hope that all noble Lords will come on board, because those gaps definitely still exist.

I want to speak to your Lordships about women, because last year I chaired an inquiry in Scotland into misogyny. It was a very powerful experience to hear from women and women’s organisations about the extent to which women are abused on the internet. What was absolutely overwhelming was that it was not only women in councils or parliaments, or women who were journalists or campaigners; in schools and universities too, women were being traduced and abused. Threats to rape, sodomise or sexually assault women, and to facially disfigure them with acid, would take place online and then you would find people piling in. The pile-on is something this House should know about. It is where, because of algorithms and because of people having followers, huge numbers of people then jump on the bandwagon and add their bit of insult and abuse to what has gone before. Or you get “likes”. I once saw a television documentary saying that the man who invented the thumbs-up “like” regrets it to this day because, of course, he now has children and knows how painful that can be. Also, that business of liking is telling women that there are hundreds and thousands of people out there who think that these things should be done to them.

I really regret to say that, of course, it is not policed. Prosecutions are brought only very rarely, because of the cover of anonymity, which is problematic. We are going to have to discuss this during the course of this Bill because it draws a veil over those who do it. As well as the pile-on, one of the difficulties is—and I say this as a lawyer—the thresholds you have to pass for criminal prosecution. People have learned that you do not say, “I’m going to come and rape you”; they say, “Somebody should rape you. You deserve to be raped.” The message to women, therefore, is not, “I’m coming to get you”, but “Somebody out there just might”. It has an incredible effect on women.

We have to have that in mind when we come to Committee. We have to recognise the urgency, in relation to children particularly, but we also have to be alert to the ways in which women and girls are finding their lives made wretched. They are made fearful because of threats. Criminal prosecutions should be brought more regularly, because if there is anything that will stop this, it will be that. We have to be very vigilant about media freedom—I agree entirely—but we also have to make sure that we keep the Secretary of State out of this. I do not want to see politicians having their fingerprints on it, but the idea of a Joint Committee to monitor the way in which regulation takes place and to watch developments, because technological developments happen so quickly, is a good one.

We have to address algorithms. We heard from the Russell family that, even after Molly Russell had died, there on her technology she was receiving—it was being pushed at her—stuff about suicide, and the child was no longer alive. This is not about soliciting information; this is it being pushed in the direction of people. I urge this House, with all its usual great expertise, to make this Bill the best we can make it, certainly just now; but the priority first and foremost must be children.

My Lords, it is a huge honour to speak immediately after the noble Baroness, Lady Kennedy. She is one of my sheroes; she did not know that but she does now—and it will be recorded in Hansard. I declare my interest as CEO of the Muslim Women’s Network UK. Let me start by saying that the speech from the noble Baroness, Lady Kidron, was heartfelt; I will support the amendments that she plans to put forward.

I will focus on four areas of concern: the abuse of women and girls; pornography; extremist and misogynistic content; and digitally altered body images. First, I share the concerns that have been raised many times today by noble Lords on the gaps in this Bill to tackle the online abuse and harassment of women and girls sufficiently. I therefore support the call from the noble Baroness, Lady Morgan, to introduce a code of practice.

Secondly, on pornography, I strongly support the recommendations from the noble Lord, Lord Bethell. I will also support any amendments that he plans to table. Inaction by successive Governments to tackle easy access to pornography by children has led to harmful sexual behaviour towards women and girls. This Government must go further to strengthen age verification. There is plenty of technology to do this. It can and should be implemented without delay.

Thirdly, there is a lack of accountability when it comes to publishing extremist and misogynistic online content. I am concerned that, according to the vague definition in the Bill, any online platform can call itself a recognised news publisher and then be exempt from complying with any requirement in the Bill. This will result in online platforms being free to promote harmful hate speech, including misogynistic content, and not having to remove it.

Finally, another urgent concern is the digital alteration of body images and sizes in advertising. Although boys are exposed to digitally altered images of men, girls are exposed to a far greater number of images of women that are highly manipulated and altered. Editing images of models involves taking inches off bodies and faces. The manipulation of images in this way is causing serious long-term harm, contributing to low self-esteem, anxiety, depression and self-harm and driving young people to cosmetic surgery. Given that advertisers are promoting an unattainable body size, this type of online communication is fraudulent and harmful; it therefore can and should be addressed in this Bill.

Earlier, the noble Baroness, Lady Merron, raised concerns about disinformation and misleading material being widely available and causing harm. This is a prime example of that, but it is often overlooked. I know that Luke Evans has introduced a Private Member’s Bill in the other place; however, this Online Safety Bill provides a prime opportunity to tackle this issue now. I urge the Government to listen to the serious concerns being raised by many campaigners, including Suzanne Samaka, founder of the campaign #HonestyAboutEditing. Other countries, such as Israel, France and Norway, have already taken decisive action by legally requiring altered images to carry a label. The UK has been left behind the curve. How will advertisers be held accountable? Will the Government consider legally requiring advertisers to label digitally altered images? Can the Minister inform the House of any alternative plans to tackle this harmful practice by advertisers, such as introducing a code of practice?

There is a common thread in all the concerns that I have shared today: how the weaknesses in this Bill will have a disproportionately negative and harmful impact on the lives of women and girls. If this Government are serious about protecting women and girls from harm, they must take a more holistic, robust approach to their safety.

My Lords, finally, the long-awaited Online Safety Bill arrives. The noise preceding it has been deafening. It is noise that we should be proud of because it is the sound of a healthy democracy deliberating on some of the most crucial issues in our society; between privacy and security, sensitivity and freedom of speech, it goes to the integrity of our democracy.

These are not new issues at all, but the context is. Online safety is as broad as the landscape it inhabits, making this Bill of great complexity. I support it. Most of us in this Chamber grew up without the internet—something that our children find unimaginable. Now it is as commonplace as the air we breathe. However, it is not as universally available, for access is controlled by a small number of tech companies that have for years declared themselves platforms and dodged responsibility for content. So a sort of terrifying social anarchy seems to have emerged, where no one is accountable or responsible for anything. This offers a free space for terrorists, easy access to pornography, hate speech and bullying. Social media is available 24/7, 365 days a year, which has driven some of our children to despair. We face growing concerns about how our democracy is being undermined and manipulated—about what is real and what is a Russian bot. Regulation was always coming, but the question is: what sort? We should always be mindful that we do not want the sort of highly censored internet we see in China.

How do we effectively regulate something like the net, which shifts like sand? I have a few points. First, I support the establishment of a duty of care for legal but harmful content for children. To my mind, drawing the line only at what is illegal falls woefully short of creating the sort of nurturing and safe environment we strive to create elsewhere in society for our children, whether in family units, at school or within the wider community. It is said that it takes a village to bring up a child, but now that village is online. However, we must be transparent about how we do this.

That brings me to my second point: we must avoid censorship with no transparency—whether it is by a government or a tech company—for it is only transparency that guarantees accountability.

Next, I turn to the point about anonymity, which the noble Baroness, Lady Kennedy, also raised, among others. It is my belief that the assumption in favour of anonymity on the web encourages people to be the worst, not the best, version of themselves. It gives disguise to trolls and bullies, and allows no off button and no shame. I support steps to encourage platforms to verify users’ identity. I understand that there will be some who cannot, such as victims or dissidents, but they can be drawn to sites that are known to protect them. Then there are those who will not, who can seek less mainstream sites, which we, as users, can choose not to use.

Fourthly, we should be doing more to address the challenge of the health of our democracy and the quality of discourse that underpins it. The insidious power of algorithms is driving us to echo chambers and polarising debate. We have lost a sense of a common truth, and with it what forms a lie. This is especially concerning around election campaigns, where fraudulent advertising or disinformation may be difficult to judge and may sometimes come from foreign agents. And what of spending limits? We carefully constructed these through Electoral Commission rules, yet there is a free-for-all on the web. I believe there is more we should do to secure the integrity of the poll online.

Some of the smartest people in the world created the internet; there is no reason why they cannot fix some of its worst characteristics. This is the first of what will surely be many Bills about online safety and how we regulate the internet. While we must strive to protect, we must also be mindful of the boundaries between privacy and security, and freedom of speech and censorship. These are questions which have run for generations through our democracy and always will. We must understand and be honest with ourselves that, while this is a battle worth fighting, it is a battle we will never entirely win.

My Lords, I speak in this Second Reading debate with little detailed knowledge of the digital world. I will probably be taking up my noble friend Lord Allan’s offer. I am not on Facebook, TikTok, Instagram or Snapchat; I have occasionally dabbled on Twitter. What I do have is 40-plus years’ experience as a teacher and head teacher. I have seen first-hand how children can have their lives turned upside down and how they have been physically and emotionally scarred by the effects of social media and the online world.

Yesterday, we heard from a study by the Children’s Commissioner for England how children as young as nine are being exposed to online pornography; how a quarter of 16 to 21-year-olds saw pornography while still at primary school; and how, by the age of 13, 50% had been exposed to it. You might say, “So what?” Do we want to hear that 79% of 18 to 21-year-olds have seen pornography involving sexual violence while they were still children? Do we want to hear that a 12-year-old boy had strangled a girl during a kiss because he thought that was normal? Do we want to hear that half of young people say girls expect sex to involve physical aggression? This all comes, by the way, from the Children’s Commissioner’s report.

The Online Safety Bill, as we have heard, has been a long time coming. The Government’s aim in introducing the Bill is to make Britain the best place in the world to set up and run a digital business, while simultaneously ensuring that Britain is the safest place in the world to be online. But does the Bill really achieve that for children? Childhood is about loving and learning. It is about innocence and enjoying the wonders of life. It is not about having that innocence and wonder shattered by some perverse online content.

My interest in this Bill is how we as a society can restore childhood to our children. The Bill, as the noble Baroness, Lady Kidron, said, must cite the UN Convention on the Rights of the Child, and General Comment 25 on children’s rights in relation to the digital environment. Citing this in the Bill would mean that regulated services would have regard to children’s existing rights. The limited scope of the Bill means that, as the 5Rights Foundation points out, children will still be exposed to harmful systems and processes, including blogs and websites that promote and encourage disordered eating, online games which promote violence, financial harms such as gambling, and parts of the metaverse which have yet to be developed. The Bill will not be future-proofed. Regulating only certain services means that online environments and services which are not yet built or developed are likely not to be subject to safety duties, which will quickly make the Bill out of date.

Turning to age verification, as a teacher it always worries me that children as young as seven or eight are on Facebook. In fact, 60% of UK children aged eight to 12 have a profile on at least one social media service. Almost half of children aged eight to 15 with a social media profile have a user age of 16 plus, and 32% of children aged eight to 17 have a user age of 18. Without age assurance, children cannot be given the protections needed to have an age-appropriate experience online. Some 90% of parents think that social media platforms should enforce minimum age requirements. We should do whatever we can to protect children from harm. The Bill will establish different types of content which could be harmful to children:

“primary priority content that is harmful to children … ‘priority content that is harmful to children’ and ‘content that is harmful to children’”.

I say that any content that is harmful to children should be dealt with.

As the noble Lord, Lord Hastings, has said, media literacy is hugely important to this Bill and should be included. Media literacy allows children to question the intent of media and protect themselves from negative impacts, be it fake news, media bias, mental health concerns or internet and media access. Media literacy helps children and young people safely consume the digital world. I was a bit disappointed that the noble Lord, Lord Hastings, did not ask what a Liberal Government would do, but I can tell him that we would be dealing with this issue.

Yesterday, the Princess of Wales launched a campaign to highlight the importance of childhood. Children need to enjoy their childhood and grow up in a supportive, caring environment. They need good role models, not influencers. Children are very vulnerable, innocent and susceptible. We must do all in our power to ensure that online is a safe place for them, and to be able to say to the daughter of the noble Baroness, Lady Harding, that we did finally do something about it.

My Lords, it is clear from the last speech that we must do much more to protect impressionable young people from the torrent of racism, extremism and dangerous conspiracy theories online. Sites like Facebook and Twitter fuel division, anger and extremism, which can lead to threats and violence. Small sites like 4Chan, Odysee and Minds do not even have the third layer of the so-called triple shield. People are routinely targeted, intimidated, bullied and harassed, as we have heard in so many speeches during this debate. This has a terrible impact on public debate, let alone mental health.

Research by the Antisemitism Policy Trust revealed that there are two anti-Semitic tweets per year for every Jewish person in the UK. That report was before Elon Musk’s takeover relaxed the rules. As we have heard from the noble Baroness, Lady Anderson, women get it worse, with all sorts of disgusting abuse and even rape threats. Yet the Government have so far not accepted calls for an Ofcom code on violence against women and girls. Anyone of any age can set up a Twitter account with just an email address, giving them access to hardcore pornography. Future generations will be amazed that we allowed this lawless wild west to develop.

Sites like Twitter allow the repeated publication of completely false, defamatory and made-up images, making completely unfounded allegations of the most vile behaviour. It ignores complaints, and even when you try to take them up and can show clearly how posts break its rules, it will not do anything about it. Twitter’s entire business model is based on fuelling argument, controversy and anger, which obviously leads to abuse and in some cases threats of violence. This can become addictive, leading to a terrible impact on people’s mental health.

People abroad are making billions out of poisoning public debate and making the mental health of vulnerable people worse. Imagine it: who would be allowed to set up a business to deliver anonymous hate mail about other members of the public through people’s front doors, which is essentially what Twitter is able to do? Why are we allowing billionaires abroad to decide what young people in the UK are subjected to, instead of Parliament, which is accountable to the public, setting rules that are properly understood?

You do not need to be paranoid to ask why hostile countries might use social media to undermine western societies with extremism and violent argument. This is not about limiting free speech or censorship—remember, these sites already curate what we see anyway—but implementing proper systems of age verification and holding the executives to account when they break the rules.

I share the concerns of the noble Baroness, Lady Fall, about whether we really need anonymity on social media in the UK. Freedom of speech should not allow threats of violence or rape, or disgusting abuse. In any event, people have the freedom to say what they like, within the bounds of the law, but that does not mean they should not be held responsible for it. Nor is it true to say that this would affect whistleblowers in countries like ours. The people who make rape threats or publish violent abuse are not whistleblowers.

Finally, as the noble Lord, Lord Black, said, the boundaries between newspapers, broadcasters and social media companies are becoming more blurred all the time. Twitter and the rest of them are clearly publishers. They should be held to account for the material on their sites, in the same way as newspapers.

We need to see small, high-harm platforms brought into the scope of category 1 platforms; the re-introduction of risk assessments for legal harms; and a reversal of the current fudge on anonymity, with at the very least fines for platforms that are unable to know who their customers are. We need to look again at the status of these companies as publishers. Finally, we need to see action on search engines, including Google, which largely escape any actions in this Bill.

My Lords, we are privileged to live in an age of internet technology, which gives us greater access to information and means of communication than at any point in human history. But to get the most out of this online world it must be safe and effectively regulated to counter harm and misinformation. I fully support the Bill. It is a good start, but it needs to be improved in a number of areas.

I begin by paying tribute to the noble Baroness, Lady Kidron, for her tireless work in this area, and for educating and helping us to focus on some of the core and fundamental issues. I will underline some of the amendments she proposes that I intend to support.

As the noble Baroness said, the Bill primarily focuses on user-to-user services and search engines, as defined in Part 2, but harmful content published on websites such as blogs falls outside the Bill’s scope. The noble Baroness’s amendment to include within the Bill’s scope any internet service likely to be accessed by a child is crucial. I strongly support it.

The Bill must also address business models that drive users to this content. As we know, this occurs through platforms, algorithms and push notifications, which amplify and perpetuate access to this content, as illustrated by the tragic death of Molly Russell. I support the noble Baroness’s amendment to Clause 10, which would ensure tough regulation in this area and assessments to tackle drivers of harm, including the design and features of the platform.

Furthermore, online safety should apply not just to children. The 2019 White Paper said that content that actively harmed any user should be tackled. It is deeply regrettable that the Government removed adult safety duties from the Bill, arguing that this would undermine free speech. On the contrary: online safety for all has the potential to enhance free speech, as people can engage on platforms without being exposed to harmful content. I urge the Government to reverse this decision.

The importance of balancing privacy online with the need for public safety is of course crucial. Encrypted messaging services such as Facebook Messenger or WhatsApp are right to keep private messages confidential, but the Government have argued that there are situations where law enforcement agencies must have access to messages on these platforms. Can the Minister explain how they intend to balance privacy and online safety with regard to encrypted messaging services?

Age-verification regimes need to be strengthened to ensure that children are not exposed to pornography. The noble Lord, Lord Bethell, made a very powerful case for that, and I strongly support the amendment which he will bring forward. His proposed amendment, he said, would bring Part 3 of the Digital Economy Act into Part 5 of the Online Safety Bill. As he said, this offers a very neat solution to addressing the significant gap in the Bill, and would make the definition of pornography online consistent with regulation of content in the offline world. I also support the amendment of the noble Baroness, Lady Kidron, to Part 4, which would task Ofcom with producing statutory guidance for age assurance. I also support her amendment to Part 7, requiring platforms to provide a point of contact to bereaved families or coroners when they have reason to suspect that a regulated service holds relevant information on a child’s death, and an amendment requiring social media platforms to share information with coroners in cases like Molly Russell’s.

While this Bill is about the online safety of children, this is an opportunity to include online fraud provisions in legislation, which predominantly affect the elderly. We need a regime where law enforcement, financial services and tech platforms collaborate to reduce online fraud. Would the Government be willing to entertain an amendment that encouraged such collaboration, to ensure that user-to-user platforms and search engines are accountable for fraudulent advertising on their platforms?

The Government have signalled that they will put forward an amendment to treat videos of people crossing the Channel which show the activity in a positive light as illegal content, which I of course support. Can the Minister assure the House that this amendment, intended to target those who encourage people smugglers, will not criminalise those who show sympathy online for asylum seekers?

Next, civil liberties groups have described social media as a modern town square. To make sure that this town square is used positively, we need robust provisions for media literacy. A new media literacy duty in the draft Bill has been dropped; now it is mentioned only in the context of risk assessment, and there is no active requirement for internet companies to promote media literacy. There is a wide media literacy gap which leaves many at risk of harm. I agree with Full Fact that a stronger media literacy duty should be reinstated with Ofcom in this legislation to produce a statutory strategy.

Finally, this is a fast-changing area, as others have said. While we can improve this Bill, we cannot make it perfect. I therefore strongly urge the Government to commit to subjecting this legislation to post-legislative scrutiny after three years.

My Lords, this debate has attracted a lot of attention: some 60 speakers, nearly all of whom have run over their time. I will just make one or two observations. First, we have been waiting a long time for this Bill, so we had better make a good job of it, because I doubt that the Government will let legislation through again for a good five or six years. The second point—I pick up something that my noble friend Lord Inglewood said—is that we need more flexibility in the law. The procedures that we have are not suited to the speed at which the internet has developed. It is no good saying that you can have a Henry VIII power, give it to a Minister and then forget it; we need to devise a method of reviewing laws on a regular rolling basis, such as they have in the United States, because the law will be out of date whatever we do.

I am fully behind the amendments of the noble Baroness, Lady Kidron, and my noble friend Lord Bethell. I think that they are excellent amendments, and I look forward to us discussing them. We do not need to do that now.

I would add into the procedures that we need to give careful thought to the idea of anonymity on the internet. I am against it, personally. I am a member of the ConservativeHome site, where I appear as “Richard Balfe”. Some people are there with very odd names, such as “Brussels Hater” and other handles which do not reveal who they are. I notice that the more obscure the name is, the more violent the contribution is. We need to look very carefully at anonymity; the people who need to hide behind anonymity are probably not the sort of people whose conduct we, in considering this Bill, would wish to encourage.

My next point is about penalties. The penalties look fine—for example, 10% of world turnover—but of course these are not penalties on the firms; they are business expenses, and that is how they will be seen. I am not a great admirer of the American system but I will say one thing that came out of a visit I paid to Washington. I talked to legislators about how they enforced legislation—in this case it was against financial firms—and the Congressman I was speaking to said, “It is very simple: you imprison them”. He said that if a Bill has a possibility of imprisonment, it puts the fear of God into directors in a way that no fine, however big, does, because that is a business expense and can be planned for. We need to look carefully at whether there should be a custodial element in the Bill for severe breaches. I think that would help to get it implemented. Otherwise, the danger I see is that we are in competition with lawyers based in Hollywood rather than with people based in London.

I look forward to the Bill passing; I hope we will do it carefully and considerately—I am sure we will—and take on board the amendments of my noble friend Lord Bethell and the noble Baroness, Lady Kidron, and the other improvements which have been mentioned.

My Lords, I want to focus primarily on the safeguarding of children. I support the general provisions and intent of this Bill; it is clearly going to be a very important tool in keeping children safe online.

While the internet has so many benefits, it exposes children to myriad harmful content, such as pornography and content promoting self-harm and suicide, as well as targeted abuse and grooming. Molly Russell’s name has become synonymous with the Bill, and it is important that we get this legislation right so that harms online, such as those Molly encountered, are not just reduced but eliminated. We need to make the online world as safe as it can be for our children.

We know that young children are able to sign up for accounts on social media platforms with little or no protection from the harms they face; they are able to freely access pornography without restriction. It is shocking that over 60% of children under 13 have accessed harmful content online by accident. To safeguard children and young people thoroughly, we need to ensure that the protections for children offline are mirrored online. I fully endorse what the noble Lord, Lord Bethell, said and I will be supporting him in the amendments he brings forward. I also support those that will be brought forward by the noble Baroness, Lady Kidron.

In the offline sphere, under the Video Recordings Act, the British Board of Film Classification, for example, is responsible for classifying pornographic content to ensure not only that it is not illegal but that it meets established standards. None of these offline standards is applied online at all.

The online pornography industry has developed and evolved without any—never mind robust—regulatory oversight. But, given what is available online, much of which is illegal, oversight is greatly needed and overdue. The Bill provides the opportunity to put that right, and we must not miss this opportunity because, as we have heard, this may not return for some years.

Age verification was supposed to be implemented under the Digital Economy Act. As a result of the Government’s decision not to implement Part 3 of that Act, children have had unfettered access to pornographic content. Therefore, in my view, age verification needs to be implemented as swiftly as possible. A coalition of charities are proposing that Ofcom must prepare and issue a code of practice within four months of Royal Assent and that age verification should be implemented within six months. That is the very minimum that we should expect. We owe it to our children that they are not exposed to any more harm than they have been already.

Much of the debate in the other place on the issue of free speech focused on the Bill’s provisions to regulate what is legal but harmful. It is important that we ensure that the provisions of this Bill protect free speech, while at the same time protecting vulnerable people against deeply damaging material and content. The Bill now places a duty on user-to-user services

“to have particular regard to the importance of protecting users’ right to freedom of expression within the law”.

We need to examine the operation of this duty very carefully. The Bill and the law must reflect the principle that whatever you can say offline on the street should be protected online. Large internet companies should not have the power to decide what is said and not said online. If a company removes speech that would be legal offline, it must be placed under an obligation to give reasons why that speech was removed and be held to the highest standard of accountability for removing it.

Social media companies are enormous cartels that dominate our culture. The Government, in bringing forward this Bill, have concluded that they cannot be trusted with users’ safety. They cannot be trusted to keep their platforms safe, and equally they should not be trusted with free speech. I want also to endorse those noble Lords and Baronesses who have called for action to be taken against the awful abuse and trolling of women and girls online, and particularly the use of anonymous accounts. This issue needs to be tackled, and I look forward to working with others in Committee to strengthen the Bill in all these safeguards.

My Lords, they say that there is no such thing as a free lunch. When it comes to the social media companies, that is certainly true. Google Search is free, as are Facebook, Twitter, Instagram, WhatsApp, YouTube, TikTok and a host of other online services. All of them are great products, hugely popular and used by billions of people every day throughout the world. So it begs the question: why are they free? It is because the mass of data that the internet companies hoover up on their billions of users is a treasure trove. They collect data such as location, shopping, searches, medical records, employment, hobbies and opinions. It is said that Google alone has more than 7,000 data points on each one of us. In our innocence, we all thought that we were searching Google; little did we realise that Google was searching us.

What do they do with this hoard of data? They synthesise it through algorithms. They sell their results to advertisers. Traditionally, advertisers spent huge amounts on newspapers, television and other media, struggling to target their markets. It was imprecise. Today, using the data provided by the social media companies, advertisers can personalise their message and pinpoint it accurately. It is hugely cost effective and it generates hundreds of billions in revenue. Data truly has become the new oil.

Of the five largest companies in the world by market value, four are big tech: Apple, Microsoft, Alphabet/Google and Amazon. Indeed, Apple alone has a market value equal to the combined value of all the companies on the FTSE 100 Index. Big tech is bigger than most countries. The big tech companies are richer than us, they move faster than we do, they are aggressive, they are litigious, they are accountable to no-one, they have enormous power, and they make their own rules. They employ the smartest people in the world, even including a previous Deputy Prime Minister of our country.

The Zuckerberg shilling can buy a lot of influence. Let us take a look at Facebook. Its platform has allowed the most unspeakable acts of violence, hate and perversion to go viral, pretty much unchecked. It says that it moderates content, but it is not enough, and usually too late. Now we learn that Mr Zuckerberg is spending $10 billion a year on developing his metaverse. Already we have read of examples of virtual reality sex orgies, and participation in gruesome violence, all viewed through a Meta headset, where the avatars are quasi-people and it becomes almost impossible to distinguish reality from fiction. Imagine where that is all going. Frances Haugen, the Facebook whistleblower, had it right when she said that only profit motivates the company.

This is a landmark Bill. We have to get it right, and we have to make it tough, with no room for loopholes or ambiguities. I have tried to paint a picture of the participants. I have worked and been involved in the digital industry for over 50 years. I know the nature of the beast. They will fight to the last to preserve their business model. Do not underestimate them. These people are not our friends.

My Lords, several Peers have mentioned the Digital Economy Act 2017 and the sadness of the constitutional impropriety when the Executive refused to implement the will of Parliament. That really concerned me because, if it had been implemented, so many children would have been protected, for several years by now. We learned some useful things during its passage that could very much be applied in this Bill.

The first was on enforcement. This is always the big problem: how do you make them comply? One of the things that will work is the withdrawal of credit card facilities. If a Government or authority ask credit card companies to withdraw facilities from a company, they will, probably internationally. In fact, this happened only a few months ago to one of the big porn sites. It soon fell into line, so we know it works.

The other thing is that anonymous age verification is possible. At the time I chaired it, the British Standards Institution issued PAS 1296 on how to do it, and several companies implemented it. The website itself does not check; it is done by an external company to make sure that it is right. The noble Lord, Lord Browne, has just explained exactly how it works. It was a very good explanation of the whole thing. About a year ago, they were intending to elevate it to an international standard because other countries wanted to use it. Certain European countries were very keen on it and are already implementing stuff.

The other thing that struck me is this: what is meant by “legal but harmful”? It is an expression that has sort of grown up, and I am not sure whether it means the same thing to everybody. In terms of pornography, which I and a lot of us are worried by, we do not want to be a modern Mary Whitehouse on the one hand, so you do not want to regulate for adults. But the noble Baroness, Lady Benjamin, who worked on this, explained all the dangers very well, as did several others. It is not just that children get addicted; they also do not learn how to treat each other and get completely the wrong impression of what they should do. In fact, horrifyingly, I heard that throttling, for instance, is on the increase because it has apparently been appearing on porn sites recently. It does not take long to corrupt the next generation, and that is my real concern: we are destroying the future.

To future-proof it, because that is the other worry, I would suggest quite simply that any website, regardless of size, that contains any pornography must require anonymous age verification for access. It is very simple. We may not want to prosecute the small ones or those that do not matter, but it allows us to adapt it to whoever is successful tomorrow—because today’s success may disappear tomorrow, and a new website may come up that may not fall within it.

The other thing I want to mention quickly is that anonymity is necessary because it is not illegal, for instance, for any Member of your Lordships’ House to go and access pornography, but it is severely career limiting if anyone gets to know about it—and that is the trouble. The same thing applies if you are a Muslim leader and wish to buy some alcohol online. That is why we need to have this. It is perfectly possible, it is out there and lots of companies can do it.

Finally, what is misinformation? Too often, it is simply an opinion opposed to your own, and I think there are huge dangers in how we define that.

My Lords, like every noble Lord today I welcome the arrival of this Bill after such a long wait. Like many others, I will focus my remarks on pornography. Standing back and looking at what has shaped and is shaping our society, we cannot ignore the fact that never in the history of humankind have we been so deluged by pornography. Graphic sexual activity is accessed through the internet on the push of a button and is almost impossible not to stumble across.

What we have been missing until recently is the data on what this deluge is doing to us all. We have heard a lot about the Children’s Commissioner and her report yesterday. Her research found that most young people have seen pornography on Twitter, Instagram or Snapchat. Moreover, online pornography is not the same as the blue magazines previously available only by reaching up to the top shelf of the newsagents for those who had the chutzpah in those days to do it.

The adult content accessible in our youth is, she says, “quaint” compared to today’s online pornography displays. Pouting page three-type nude stills have given way to video portrayals of degrading, sexually coercive, aggressive, violent, pain-inducing and exploitative acts being perpetrated particularly against teenage girls and, of course, younger children. The title of her report published this week—“A Lot of it is Actually Just Abuse”—says it all. She highlights the dangers of the normalisation of sexual violence and the template this provides for children’s understanding and expectations of sex and relationships.

Her stats are a litany of innocence despoiled. Half of children have seen pornography by age 13; some 10% by age nine and more than a quarter by age 11. Some 79% see violent pornography before age 18 and frequent users are more likely to engage, as we have heard, in physically aggressive sex acts.

While I am most concerned about the impact of pornography on children and young people, we cannot ignore its prolific use by adults. International studies show high frequency of pornography use is associated with poor semen quality and reproductive hormone quantity, as well as erectile dysfunction with flesh-and-blood partners. Meta-analyses show pornography use is never positively associated with relationship quality.

So, while the Bill has been much strengthened, I will support noble Lords, such as the noble Lord, Lord Bethell, who table amendments requiring that: first, all pornography websites and social media platforms implement third-party age verification; secondly, that there is a Bill-wide definition of pornographic content; thirdly, that online pornographic content is regulated in the same way as offline; and, fourthly, that all pornographic sites must ensure actors are genuinely over 18 so they are not facilitating child sex abuse.

To reiterate, the pornification of society is skewing our values and practices towards cruelty and selfish gratification in intimate relationships. It is undermining efforts to tackle abuse and violence, particularly against women and girls. Not bringing Part 3 of the Digital Economy Act 2017 into force was a dereliction of duty. I was involved at Report stage, and strong forces were clearly at work to preclude hampering adult access to pornography—even to material that would have been illegal offline. The priority then seemed to be securing adults’ continued access to violent, misogynist, racist and degrading material and protecting their privacy.

Almost six years on, my and others’ plea is that we strike a better balance: introduce an age-verification regime at the speed befitting this public and mental health emergency. Third-party providers can give adults the privacy they crave and children the protection to which they are entitled in a civilised society. For too long, we have bowed a knee to cyber libertarian ideology that says internet regulation is impossible, unworkable and unwanted. This Bill must take big, bold, well-evidenced steps to reverse the decades of harm this ideology has caused.

My Lords, I am very thankful to be in this House to discuss this Bill. I know many Lords have commented on the Bill being rather late but, being a relatively new Peer, I am pleased to be able to contribute to this debate—it is something I have been active on, in another place, for quite some time. So I congratulate the Minister on bringing the Bill to the House.

Everything that a person sees on social media is there as a result of a decision taken by the platform that runs it—a point very powerfully made by the noble Baroness, Lady Kidron—and we have heard the tragic outworking of that for children in this debate. In an article in the Daily Telegraph on 13 December last year, it was reported that Meta knew it was promoting content harmful to teenagers—that was in an internal document leaked to CBS News. It suggested that Meta knew Instagram was pushing girls toward dangerous content.

I will not repeat the many valuable points that have been made on the safety of children—I support them all and will be supporting the amendments from the noble Baroness, Lady Kidron—but I want to make a number of further points, some of which are unfortunately born from personal experience, somewhat like those the noble Baroness, Lady Anderson, made earlier. Women and girls are disproportionately affected by abuse online. While I do acknowledge the user empowerment duties in Clause 12 and the triple lock, I am concerned that the Government’s proposals do not go far enough to protect women and girls. They put an onus on individual users to protect themselves, and while the individual can choose to opt out, it does not protect millions of others from being able to see the content.

As well as fearing for vulnerable women and girls who see such content, I am concerned at the chill factor to women and girls getting involved in public life. Many potential political candidates have said to me that they could not go through what I endure online, and so they do not. That is not good for democracy and not good for encouraging women to come forward. Therefore, I support the proposal to produce a code of practice on violence against women and girls modelled on Carnegie UK’s previous work on hate speech, and that it should be introduced as an amendment to Clause 36. I thank Carnegie UK for its work, over a long period of time, on these issues.

Additionally, it has to be said that some of the trolling against politicians and people who speak out on issues is undoubtedly orchestrated. I hope that that level of orchestration by vicious online mobs—the pile-on that the noble Baroness, Lady Kennedy, referred to—can be looked into as well. I hope the Minister will be cognisant of that point.

I am pleased that anonymity has been raised in the Chamber this evening. The argument goes that if everyone had to be identified and verified online, this would prevent whistleblowers and others, such as the victims of violence, coming forward and speaking out, so they need anonymity. I understand that argument but, given that the majority of abuse and criminal activity comes from anonymous accounts, surely there could be a way to protect genuine free speech users from those who overstep the line and threaten violence. I believe this could be achieved by platforms holding the ID of users behind a firewall that could be breached only if there were reasonable grounds to suspect that a criminal offence had been committed. There are those who use anonymity as a cloak of protection from criminal law. That needs to be challenged. I recognise that this is a cross-jurisdictional issue. However, it is one we need to tackle in this House.

Finally, I support and endorse the amendments being brought forward by the noble Lord, Lord Bethell, on those under 18 accessing pornography, particularly on robust age verification and a clear definition of pornographic content. I commend the work of the noble Lord and the coalition of NGOs that have been working with him. I thank them for their clear papers on this issue.

I support the principle of the Bill, but we will have a lot of work to do to strengthen it. I look forward to taking part in that.

My Lords, I shall attempt to be brief but, based on previous experience with other speakers, that may be difficult. At least it gives the Whip on the Front Bench the chance to do some agile body moves.

I welcome this overdue Bill. I think the Minister got it slightly wrong when he congratulated us on waiting patiently for it. Judging by every single contribution around the entire House today, patience has been rather wanting. We want to get on with it. Like many government Bills, this has grown like Topsy. It has grown sideways, downwards and upwards. We need to beware of going around in circles. Above all, we need to expedite this and get it on the statute book.

I will focus on three key areas. Unsurprisingly, the first will be children. Here I declare that I am a governor of Coram, the oldest children’s charity in the United Kingdom. I will certainly support amendments such as those that the noble Lord, Lord Bethell, was talking about to try to bring in proper age verification.

Like many other noble Lords, on Monday I had the privilege of sitting in on the briefing that the noble Baroness, Lady Kidron, arranged. Ian Russell, the father of Molly Russell, was present, together with one of her sisters. What we saw was truly shocking. In some ways it was particularly shocking to me because, as Ian shared some of his daughter’s diary—what she had actually written in the days and weeks before she died—I had a sudden jolt of recognition. What 14 year-old Molly was saying was almost identical to the transcript of the suicide note that my father wrote to my mother, which I have in my desk at home. It has the same self-loathing, the feeling of worthlessness and the belief—completely wrong—that you would better serve those you love and live with by departing from this life. My father was a Second World War veteran who had won the Military Cross. He was suffering from manic depression and was clearly in a depressed state, but I cannot even begin to imagine the effect it must have had on Molly to have the filthy, negative, awful, harmful content in which she was deluged 24 hours a day. Perversely, the more she looked at it, the more excited the algorithm got and the more she received.

Particularly disgraceful is that it took no less than five years for the family and their lawyer finally to get some of the platforms Molly had been watching to disgorge and show some of the content she had been viewing. Five years is wholly and utterly unacceptable.

I take the point that the noble Baroness, Lady Bennett, made about young people being involved. It would be a good idea for Ofcom in some way, shape or form to have access to young people advising it. I support in principle the idea of a Joint Committee of Parliament. Again, I think it would be very helpful to have young people advising that.

The second area is supporting the wonderful noble Baroness, Lady Kidron. I declare quite openly that I am a Beebanite. I think there are quite a few of us in the House, and we will do everything we can to support the wonderful noble Baroness in everything she does.

Lastly, I come to the companies. I speak as somebody who was a head-hunter for 30 years. A large part of our business was in North America and—surprise, surprise—a lot of our most wonderful clients were some of these new tech giants. I know a lot because of that about what I would call the psychology of attraction and repulsion. I can tell the House that for many years, on going to a candidate and saying, “Would you like to join Facebook? Would you like to join one of these other companies?”, they would get pretty excited, because it is new technology, there is a lot of money, it is sexy, it is probably in California—what could be better?

We have to change the paradigm in which people look at potentially being employed by those companies. We have to create a frisson of fear and forethought that, if they do join forces with those companies, not only might their personal reputation suffer but the reputation of the company will suffer, shareholders will suffer, and those who provide services to that company, be they banks or lawyers, will also suffer. That is what we need to change. I will do everything I can, working with others who probably know rather more about this than I do, to concentrate on getting into the minds of those companies, which have huge resources, legal and financial, to resist whatever we do. We have to get inside their minds, find their weak points and go for the jugular.

An offence in this Bill is an offence under the law of any part of the UK. There is a complex interplay between online safety, which is reserved, and devolved matters such as child and adult protection, education, justice and policing. I realise that the legislative differences between Scotland and England are quite topical. The new offence protecting people with epilepsy, for example, does not cover Scotland, as Scottish law already covers this behaviour, as is the case with the new cyberflashing offence.

However, the Bill does give Scottish Ministers the powers to amend regulations relating to priority offences in Part 2 of Schedule 6. I think government amendments in the other place mean that Scotland’s hate crime Act will not affect what people can and cannot say online in the rest of the UK, since it was passed by a devolved authority without the Government’s consent. But I believe a loophole remains whereby a future Government could simply approve that or any other law that has been passed in Holyrood, so Nicola Sturgeon could still become the content moderator for the whole of the UK. How should online providers therefore respond where there are differences in legislation across the four nations?

Access to data is clearly essential to ensure that the dynamic landscape of online harms is understood in the Scottish context. I am thinking of issues for rural and remote communities, how online platforms respond to sectarian content, or understanding the online experiences of people with drug or gambling addictions. Are there any differences across the UK? In terms of the transparency reports required by the Bill, will Ofcom be able to see that data in a nation-specific way?

Scotland has a thriving gaming industry, but it is unclear if there is industry awareness or involvement in this Bill and its implications for gaming platforms. I declare an interest as a board member of Creative Scotland. Will the Minister elaborate on what consultation there has been with gaming companies across the UK, including in Scotland?

The Bill rightly recognises that children are a vulnerable group, but has thought been given to the definition of a child throughout the United Kingdom, because in Scotland it varies. The 2014 Act includes all children up to the age of 18, but there are instances where someone aged 16 may legally be treated as an adult, and other circumstances where disabled or care-experienced children can be included in children’s services until their 26th birthday. As other noble Lords have mentioned, people with physical disabilities, learning disabilities or mental health issues, people in care, people with addictions and many more of all ages could be classed as being vulnerable online. What is the data on looking at online harms from purely an age perspective?

I note that there is an obligation to consult disabled people on decision-making, but should not all those within the CRPD definition of disabled be within the scope of the consultation requirements of the Bill? I would like to see the consultation duties under Clauses 36 and 69 strengthened. I also support calls from other noble Lords for requirements to be placed on providers to risk-assess their customer base, and to provide basic safety settings set to “on” by default.

However, I do welcome the Bill. It is, as others have said, a landmark piece of legislation. We will be far better off with it on the statute book than we are now, but I hope we can get some of the details right as it makes its way through your Lordships’ House.

My Lords, I very much welcome the Bill to the House, late as it may be. Like the noble Lord, Lord Storey, I know very little about the internet. I certainly know less about the sites we are talking about tonight, but I know that some of those sites are destroying our young people and poisoning their minds.

Age verification in terms of safety for children online was first debated in 2016. It is remarkable that a child who was eight years old when this proposal was first put forward will be an adult when the protections that they deserve will finally be in place. Many children will have been allowed to live through their formative years being exposed to untold harm online. A child who was eight in 2016 could potentially be in the grip of addiction by the time that age verification is made a legal requirement. This did not need to be the case. The harms suffered by many teenagers over the last seven or eight years could have been avoided. As the noble Lord, Lord Dodds, indicated, if the Government had only done what they were supposed to do and implemented age verification through Part 3 of the Digital Economy Act, children could have been protected.

According to research by DCMS, 80% of children aged six to 12 have viewed something harmful online, while over 50% of teenagers believe that they have accessed illegal content online. We cannot allow children to continue to be let down. We need to ensure that robust age verification is in place, but, more than that, we need to get it right. While the Bill is a step in the right direction, I think there is a lot more work to be done. This is an important Bill, but it is also important for this House to get it right.

First, we need to ensure that age verification on pornography sites will be brought in on this occasion. The Government cannot be allowed to sidestep this issue. A clear commencement clause needs to be placed into the Bill.

Secondly, we need to ensure that age verification is in place, not just for children accessing pornography; the age of those acting in content must also be verified. User-to-user pornography websites are simply a hotbed of illegal material and child sexual abuse, and that needs to be stopped by the Bill. If it includes clear age verification for those involved in the content, it will be a valuable tool in ensuring that children are not exploited online.

Thirdly, we need to move to protect women and girls from the effects of online pornography. Harmful pornography content promotes violence against women and girls. Evidence shows that excessive consumption of some legal pornography material can result in offenders viewing illegal child sexual abuse material. As increasingly extreme pornography becomes available on mainstream sites, the threshold of what is acceptable is very much lowered.

There is much to support in this legislation: it offers an opportunity to ensure that we can protect women and children. I look forward to working with others to ensure that we can deliver on these important protections.

My Lords, I welcome the Bill’s commitment to protecting children online, yet, like many noble Lords, I fear that it is not yet robust enough. I am extremely concerned about the current unfettered access that children have to online pornography—pornography that is violent, misogynistic, racist and deeply disturbing in its content. For example, analysis of videos recommended to first-time users on three of the most popular porn sites, Pornhub, Xvideos, and xHamster, found that one in every eight titles described sexual activities that constitute sexual violence as defined by the WHO. In most cases, that violence is perpetrated against women, and, in those videos, the women respond to that violence either with pleasure or neutrality. Incest was the most frequent form of sexual violence recommended to users. The second most common category recommended was that of physical aggression and sexual assault. This is not the dark web, or some far corner of the internet; these are mainstream porn sites, and they are currently accessed every month by 1.4 million UK children.

Research released yesterday by the Children’s Commissioner states that the average age at which children first see pornography is 13. Accessing this brutal and degrading content has a devastating impact on their psychological, emotional, neurological and sexual well-being. I recommend a YouTube video called “Raised on Porn”, if noble Lords want to see the damage it can do. Boys grow up to believe that girls must enjoy violent sex acts, and girls are growing up to believe that they must enjoy painful and humiliating acts, such as anal sex and strangulation. Anecdotal evidence shows that the 5,000% increase in the number of girls going through puberty who now wish to identify as male is at least partly driven by seeing this vile porn and coming to the conclusion that they would rather not be women if that is what sex involves. Yet the Online Safety Bill does little to address this. While it includes regulations on age verification, pornography will not be defined as primary priority content until secondary legislation. Furthermore, according to the Ofcom implementation road map, multiple consultations and processes also need to be undertaken. As we have heard from other noble Lords, it may be 2027 or 2028 before we see robust age verification. We cannot wait that long.

Mainstream porn consists of acutely hardcore content, which, although it does not meet the narrow definition of illegal content, is none the less extremely harmful, especially when viewed by children. Depictions of sexual coercion, abuse and exploitation of vulnerable women and children, the incest porn I have already mentioned, humiliation, punishment, torture and pain, and child sexual abuse are commonplace. In the offline world, that content would be prohibited under the British Board of Film Classification guidelines, yet it remains online with no provisions in the Bill to address the staggering gap between the online and offline worlds. That is despite the Government recognising in their own research that

“there is substantial evidence of an association between the use of pornography and harmful sexual attitudes and behaviours towards women.”

Amending the Bill to protect women and children need not be a difficult task. As many noble Lords have mentioned, provisions were made to address those issues in the Digital Economy Act, although they were not implemented. We must not make those mistakes again and allow the Bill to pass without ensuring robust protections for children and society at large.

My Lords, like other noble Lords, I welcome the Bill and the opportunity it presents, if it is strengthened, to address the many online harms which have been so eloquently outlined by colleagues around the Chamber. My starting point is ensuring that we do all we can to minimise the harms to those at risk of, or with, eating disorders. I declare an interest as the mother of a young adult daughter with anorexia, which is, as many noble Lords will know, the deadliest of any of the mental health diseases.

The evidence is clear about the harm that online content can do to people at risk of, or with, eating disorders, and its power to exacerbate their conditions. Beat, the leading eating disorder charity, last year surveyed 255 people with lived experience of eating disorders and their carers, and found that 91% of people with lived experience of eating disorders have encountered content which was harmful to their eating disorder condition. This includes sites that are innocuously called “pro-ana” and “pro-mia”, which encourage extreme starvation and extreme bulimic behaviours, and content carrying no warning, such as images or videos of body checking or of people being fed by naso-gastric tubes, as though that were something to be applauded.

As the noble Baroness, Lady Gohir, said, there are images which have been digitally enhanced to present pictures of people’s bodies that are completely unrealistic but are not labelled as digitally retouched—unlike in France, where the law states that commercial images must be labelled if they have been digitally retouched. It was good that the celebrity influencer Kylie Jenner, who may not be known to all noble Lords in this place, was called out last week in the media for digitally editing pictures of her body on social media. That is the right thing to do and this Government should be doing more on that, including in the Bill.

It is not just that those images are out there. Other noble Lords have made the point that there are algorithms which constantly pump them at people. People with eating disorders feel bombarded by a constant stream of triggering images, content and advertising which feeds eating disorder behaviours and conditions. Obviously, you can recover from eating disorders; that is good news for those of us who know sufferers. But having talked to my daughter Rose about it, I know that what happens on TikTok is that your feed page—I think it is called a “for you” page—obviously is based on the content you have been looking at over the last period. It will suck you back down into an eating disorder, just when those people with mental disorders are trying to get out. For the reasons given so well by other noble Members, algorithms need to be tackled.

I fully support what the noble Baroness, Lady Hollins, said about the insufficiency of the protections for adults. I cannot get my daughter to put food in her mouth to nourish her; how on earth am I going to get her or other vulnerable people to opt out in a different way from the social media content which is harming them?

It was excellent that Vicky Ford promoted this issue in the other place, as the noble Baroness, Lady Morgan, mentioned. She had some suggestions about ensuring that eating disorders were treated on a par and that the obligations on social media companies applied regarding those disorders. I support that entirely and hope that I can work with other Members from around the House to ensure that we can shut that loophole down, so that people with eating disorders, and their carers, are given another tool in the fight against these vicious and deadly diseases.

My Lords, no Bill that we can devise now can ever offer a complete solution to every online risk while balancing all the competing priorities. But I welcome this Bill as a critical early step down a hard road, because it sets up an adaptive structure to respond to emerging technologies and needs. We heard the phrase “living legislation” earlier, and that expresses it very well.

I would like to offer three examples of what some of our future challenges in this space are going to be. The first is AI: given the sheer quantity of content and genuine difficulty of some decisions that have to be made about that content, no platform can make the delicate judgments at the huge speed and scale we are looking for without automated algorithmic solutions. That inevitably comes to mean AI overseeing our activity and, given the vast behaviour-modification capabilities of the large platforms, AI coming to modify our collective behaviour in ways we are unlikely to understand or control. However benignly intended, the results of such developments are far-reaching and unknowable.

Secondly, there is digital identity. We have heard some brilliant contributions about this and I think we can all agree that a cornerstone of dangerous behaviour online is anonymity. Age-verification checks are easily circumvented today and I wholly support, of course, the analysis and proposals of my noble friend Lord Bethell in this area. There is a broad principle here: that online behaviour should be guided by the same constraints as behaviour in real life. In my view, the only real way to bring that about is by requiring a digital identity for everyone. That is not to say that everybody has to identify themselves at all times, but they should be identifiable if the need arises and should criminal or dangerous behaviour take place.

Thirdly, and lastly, there is the issue of enforcement, particularly in Web 3.0. We can foresee the enforcement of compliance by well-known platforms led and owned by household names, but we are increasingly going to see more and more online services provided by much larger numbers of decentralised platforms, run by so-called DAOs—decentralised autonomous organisations. These are organisations without boards and managers; they do not necessarily have employees or even bank accounts. They are going to require very different levers of enforcement. Put simply, you cannot easily apply criminal sanctions with neither owners to arrest nor real assets to seize. I am pleased that the Minister and his team have already started thinking about these organisations, as discussed at the briefing that he kindly arranged last week.

Of course, worrying about these future problems in no way diminishes the very real challenges of the present, which have been covered so movingly in our debate today. However, none of the risks to online safety is going to get any easier to manage. The growth of malicious activity and extremism will be multiplied by the greater emotional intensity of the immersive experience that will be enabled by some of the virtual reality technologies that we are now starting to see come on to the market. With this Bill, we are making a bold and important start, which I welcome, but I fear that the harder part of our journey lies ahead of us.

My Lords, as a former journalist and online publisher, I welcome this Bill. It is imperfect, of course, but it is much needed, as can be seen by the deeply disturbing data around online media and its impact on the young and vulnerable.

I believe that the free-for-all nature of the digital age requires us to build far more rigorous layers of protection and regulation than ever before. I say this having benefited myself hugely as an entrepreneur both from freedom of expression and information and from the extraordinary reach of online media. However, in this digital era of business to consumer as well as consumer to consumer—whether via social media or user-generated content—we cannot let freedom of expression trump all else. Users need protection from not just unscrupulous organisations but each other.

This is about addressing damaging behaviour and unhealthy lifestyles that the digital world has engendered, especially among the young—and not just in the well-documented areas of online hate, abuse and bullying but around increasing obesity, falling levels of exercise, declining levels of academic performance and, some argue, lower economic productivity. The need for teaching media literacy could not be any more clear.

As the noble Baroness, Lady Benjamin, pointed out, children come across pornography online from as young as the age of seven and more than 50% of 11 to 13 year-olds in the UK have accessed pornography. Even more staggering to me is that, by the age of 18, 79% of young people have been exposed to violent porn. Such exposure has contributed to surging increases in mental ill-health, child abuse, bullying, violence and sexual assault. The evidence is overwhelming—just read the research from the NSPCC, Barnardo’s, Parent Zone and many others.

This issue is so serious and widespread that, like the noble Lord, Lord Bethell, the noble Baroness, Lady Ritchie, and many others, I believe that, although it is well intentioned, the tightening regulation and guidance in Part 5 of the Bill do not go far enough. We must grasp the nettle and insist that all pornography sites, without exception, adopt robust, and ideally standardised, age-verification technology, as we have for online gambling. Given the nature of many of these sites, can we really trust them to abide by a new code of practice and expect Ofcom to enforce it effectively?

I accept that social media is a much more complex beast, but here too I believe the time has come for age verification. TikTok claims to have a minimum age requirement of 13, yet Ofcom reports that 42% of our eight to 12 year-olds are on that platform. Much of the content is unsuitable for children, but TikTok monetises traffic whatever your age. Elon Musk take note: more than 40% of young people in this country have accessed porn via Twitter.

The majority of our children and grandchildren are being exposed to a barrage of disturbing content at the most formative stages of their lives. They need protection. Yes, the implementation of mandatory AV will depress audiences and revenues. It will raise privacy issues and there will be loopholes. But in my view the social benefits far outweigh the costs.

My Lords, I declare an interest as an investor, adviser and entrepreneur in the technology industry, as set out in the register. I welcome the Bill, although it is big, complicated and difficult to understand. However, there is a real risk that we are regulating the past instead of thinking about the imminent threats of the future. I will focus on two very narrow issues.

First, as immersive environments—metaverses—become more and more popular, we have the issue of actions in these environments. This is not content, photos, videos and texts but actions. Anyone can buy a haptic glove and inappropriately touch a child in a metaverse. The child would not even know that he or she was being abused. In fact, you can buy a vest with 30 different sensors so that it feels real.

There is a whole community around age play, where adults play the role of children. This is happening right now. There are virtual reality brothels with child avatars. What if that avatar has the likeness of a real child? How much of a likeness is a likeness? What if it has the name of a real child?

This industry, particularly the immersive industry, needs guidance, and it has said that it does. I hope the Minister can elaborate on the guidance that will be provided to it. On this note, I express my support for my noble friend Lord Bethell’s amendment on mandatory age verification. The technology exists and it works. There is no reason why it should not be implemented.

Secondly, the Bill defines user-generated content very clearly, but it is completely silent on machine-generated content. What if an AI chatbot was to groom or abuse a child? Who is responsible: the owner of the dataset on which that AI has been trained or the server on which that data has been transmitted? I thought: why not ask a chatbot? I did. It said, “Yes, an AI bot can abuse a child but liability for abuse by AI bots is a complex issue.” So AI bots are already trying to get out of liability for future abuse. That is what the machines are telling us today.

There are a lot of great things in the Bill and I support it very much, but we cannot always play catch-up with technology. I hope the Minister will tell us how this guidance will be provided as it relates to emerging technologies.

My Lords, what a privilege it is to follow so many distinguished noble Lords, and in particular the speech of my noble friend Lord Sarfraz.

Our deliberation is ever more imperative, given the latest heightened and explicit concerns stated in the Children’s Commissioner’s report on young people and pornography. As a social worker, I have witnessed first hand the devastating aftermath and the lifelong impact of child sexual abuse and violence, long before children possessed the internet in their hands and pockets, and big tech companies used algorithms for content, evidently enticing children towards dangerous cycles of harm.

The backdrop of this Bill is the aim to make Britain a global leader for digital business while ensuring that it is the safest place online, and to navigate the balance between protecting consumers and stimulating innovation in a fast-moving digital world that can preserve safety and enhance freedom of speech without compromising one or the other. At a time of deepening and detrimental public services cuts, achieving the best outcomes for the legislation will require considerable financial resources, impactful monitoring and skilled oversight. The Bill will address many of the anomalies and flaws that plague the current system and prevent it from stopping harms, as authoritatively detailed by my noble friend Lady Kidron. I salute her and acknowledge the presence of Mr Ian Russell. I too was horrified on hearing the briefing.

I welcome this opportunity to ensure that platforms are held accountable for their interactions with users, even chatbots. I also value innovations, emerging technologies and the right to freedom of expression, but, cognisant of the evident danger presented by many platforms, government cannot be the protector of profits to the detriment of young minds and lives. Big tech platforms have resisted remedies, including identity assurance and age verification. Therefore, I will definitely be supporting my noble friend Lady Kidron and the noble Lord, Lord Bethell—unless the Government concede beforehand. I cannot support preserving anonymity as a shield of protection for any subscribers, content-makers and users. If we end anonymity, it will be a huge leap forward in monitoring harmful content and ensuring traceability.

As co-chair of the APPG on the metaverse and web 3.0, working with stakeholders in this space, I recognise the power of innovative technology as a force for good. At the same time, as a social worker, I want to scream out loud its threat. If we do not address the gravity of harmful content that normalises children viewing extreme material on violent pornography, diet, sexual exploitation, self-harm and revenge porn that shapes their young minds, we will have abdicated our role as protector of standards. Statistics from the NSPCC, Barnardo’s, Big Brother Watch and the Internet Watch Foundation, on unprecedented and worsening levels of online access to material on grooming, sexual abuse, self-harm, bulimia and millions of unfiltered pieces of content, make horrific reading.

Many NGOs are fearful that Ofcom is not fit to address these complex matters without incorporating children’s views into regulatory decision-making and, more importantly, to counterbalance the big tech lobbyists, their infinite resources and proficiency at skewing available data on child safety. I agree that Ofcom needs strengthening and must work with safeguarding experts to uphold standards, but it must also identify and respond to the evolving nature of harms across multidimensional interconnected platforms and a plethora of small, less well-moderated operators to ensure that children’s safety and voices are not drowned out by large tech companies whose business models are not predicated on protection and thorough risk assessment.

The APPG on the metaverse and web 3.0 wants to see children’s views prioritised, and we intend to incorporate them into our reports and programmes. Our partners are also considering the balance between safeguarding and the opportunity for increasing diversity within social media companies, recognising the historical disfranchisement and exclusion prevalent within the first wave of the social media revolution platforms. There is promise on the horizon from the newcomers—the smaller, emerging generation of conscientious organisations and companies that are proactive in engaging local communities, and inclusive in their approach. Widening participation will require institutions to consider workforce training in this sector.

Finally, the Online Safety Bill may not prevent all children accessing harmful content as this new virtual space becomes more sophisticated within the infinite metaverse and artificial intelligence space. We will need to respond smartly to this rapidly shifting national and international digital environment of emerging technology, placing the safety of children at the forefront of our consideration.

My Lords, no one who has heard Molly Russell’s story can be in any doubt about the need to better protect young people online, and I join others in paying tribute to her family for their tireless campaign.

As we have heard, vulnerability online does not evaporate on turning 18. Some adults will be at risk because mental illness, disability, autism, learning disabilities or even age leaves them unable to protect themselves from harm. Others will be vulnerable only at certain times, or in relation to specific issues. The “legal but harmful” provisions were not perfect, but stripping out adult safety duties—when, as the Minister himself said, three-quarters of adults are fearful of going online—is a backward step.

With category 1 services no longer required to assess risks to adults, it is hard to agree when the Minister says this will be

“a regulatory regime which has safety at its heart”.

Without risk assessments, how will platforms work out what they need to include in their terms and conditions? How will users make informed choices? How will the effectiveness of user empowerment tools be measured? Without the real-time information that risk assessments provide, how will the regulator stay on top of new risks, and advise the Secretary of State accordingly?

Instead, the Bill sets out duties for category 1 services to write and enforce their own terms and conditions—they will be “author, judge and jury”, to quote my noble friend Lady Kidron—and to provide tools that empower adult users to increase control over the types of content listed at Clause 12. Harms arise and spread quickly online, yet this list is static, and it has significant gaps already. Harmful or false health content is missing, as are harms relating to body image, despite evidence linking body shaming to eating disorders, self-harm and suicide ideation. Smaller sites that target specific vulnerabilities, including suicide forums, would fall outside the scope of these duties.

Describing this list as “content over which users may wish to increase control” is euphemism at its best. This is not content some might consider in poor taste, or a bit off-colour. This is content encouraging or promoting suicide, self-harm and eating disorders. It is content that is abusive or incites hate on the basis of race, ethnicity, religion, disability, sex, gender, sexual orientation and misogyny, which evidence connects directly to violence against women and girls.

And yet tools to hide this content will be off by default, meaning that people at the point of crisis, those seeking advice on self-harm or starvation, will need to find and activate those settings when they may well be in an affected mental state that leaves them unable to self-protect. The complexities of addiction and eating disorders disempower choice, undermining the very basis on which Clause 12 is built.

We heard it said today that all adults, given the tools, are capable of protecting themselves from online abuse and harm. This is just not true. Of course, many adults are fortunate to be able to do so, but as my noble and expert friends Lady Hollins and Lady Finlay explained, there are many adults who, for reasons of vulnerability or capacity, cannot do so. Requiring the tools to be on by default would protect adults at risk and cause no hardship whatever to those who are not: a rational adult will be as capable of finding the off button as the one that turns them on.

Last week, Ministers defended the current approach on the basis that failing to give all users equal access to all material constitutes a chilling effect on freedom of expression. It is surely more chilling that this Bill introduces a regime in which content promoting suicide, self-harm, or racist and misogynistic abuse is deemed acceptable, and is openly available, harming some but influencing many, as long as the platform in question gives users an option to turn it off. This cannot be right, and I very much hope Ministers will go back and reconsider.

When the Government committed to making the UK the safest place in the world to be online, I find it hard to believe that this is the environment that they had in mind.

My Lords, it is hard to think of something new to say at the end of such a long debate, but I am going to try. I am helped by the fact that I find myself, very unusually, somewhat out of harmony with the temper of the debate in your Lordships’ House over the course of this afternoon and evening. I rather felt at some points that I had wandered into a conference of medieval clerics trying to work out what measures to take to mitigate the harmful effects of the invention of moveable type.

In fact, it probably does require an almost religious level of faith to believe that the measures we are discussing are actually going to work, given what my noble friends Lord Camrose and Lord Sarfraz have said about the agility of the cyber world and the avidity of its users for content. Now, we all want to protect children, and if what had come forward had been a Bill which made it a criminal offence to display or allow to be displayed to children specified harmful content—with condign punishment—we would all, I am sure, have rallied around that and rejoiced. That is how we would have dealt with this 50 years ago. But instead we have this; this is not a short Bill doing that.

Let me make three brief points about the Bill in the time we have available. The first is a general one about public administration. We seem to be wedded to the notion that the way in which we should be running large parts of the life of the country is through regulators rather than law, and that the independence of those regulators must be sacrosanct. In a different part of your Lordships’ House, there has been discussion in the last few days of the Financial Services and Markets Bill in Committee. There, of course, we have been discussing the systemic failures of regulators—that is, the box ticking, the legalism, the regulatory capture, and the emergence of the regulator’s own interests and how those motivate its behaviour. None the less, we carry on giving more and more powers. Ofcom is going to be one of the largest regulators and one of the most important in our lives, and it is going to be wholly unaccountable. We are not going to be happy about that.

The second point I want to make is that the Bill represents a serious threat to freedom of speech. This is not contentious; the Front Bench admits it. The Minister says that it is going to strike the right balance. I have seen very little evidence in the Bill, or indeed in the course of the day’s debate, that that balance is going to be struck at all, let alone in what I might consider the right place—and what I might consider the right place might not be what others consider it to be. These are highly contentious issues; we will be hiving them off to an unaccountable regulator, in effect, at the end.

The third point that I want to make, because I think that I am possibly going to come in under my four minutes, is that I did vote Conservative at the last general election; I always have. But that does not mean that I subscribe to every jot and tittle of the manifesto; in particular, I do not think that I ever signed up to live in a country that was the safest place in the world to be on the internet. If I had, I would have moved to China already, where nothing is ever out of place on the internet. That is all I have to say, and I shall be supporting amendments that move in the general direction that I have indicated.

My Lords, I also welcome this belated Bill, particularly its protections for children. All of us, I think, very sadly over the last number of years, have witnessed the outcome of inquiries into a litany of horrific crimes against children, through decades of historic institutional abuse. That abuse, sadly, was facilitated by inaction. That might have been motivated by ignorance and complacency rather than by being complicit, but nevertheless society as a whole let down those generations of children. We must make sure that history does not repeat itself.

I am the first to admit that the internet can be a great tool for value. We saw during the recent pandemic, for example, the contribution that the internet was able to make to education, in a way that would have been inconceivable a decade ago. But there is also no doubt that there is a very negative side to the internet, through body-shaming, trolling, misogyny, anti-Semitism, racism and incitement to violence—among many other things—and most particularly, the damage that occurs to our young people and the tragic loss of life in cases such as Molly Russell and others. That is why I particularly support the amendments that will be brought forward by the noble Baroness, Lady Kidron, and by the noble Lord, Lord Bethell.

We know that early exposure to pornography, particularly violent pornography, leads to degrading and destructive attitudes and actions, especially towards women, as has been highlighted by the Government themselves in their reports on violence against women and girls. Therefore, we must take definitive action to be able to counteract that.

As the noble Lord, Lord Bethell, has indicated, there are three particular areas on which we have to intervene when it comes to amendments. First, we need robust age verification, both for users and—as has been highlighted by a previous speaker—for those who are involved in the porn industry itself and are producing its content. We know that the porn industry, and many within it, are not exactly protective of those whom they employ, and we must make sure that everything is done to protect everyone who is underage.

Secondly, I believe that, in regulations, we need clarity and consistency: consistency in a single definition of pornography; consistency in ensuring that what is illegal offline is mirrored by what is illegal online; and consistency in ensuring that high standards apply across all platforms. I join with a number of speakers today who have been highly critical of large, conglomerate tech companies and the approach that they take, but that should not blind us to the fact that some of the vilest imagery, some of the vilest abuse and some of the vilest actions happen on small platforms as well. We must make sure that we hold all platforms equally to a high standard.

Thirdly, we must ensure, particularly in terms of age verification, that we see swift and early implementation. I agree that, in terms of the detail of regulation, Ofcom is best placed to be able to deliver that. However, we also know that the full package of regulations that Ofcom will produce might be three, four or five years away. We cannot allow that level of destruction to take place in the meantime. That means, particularly in regard to age verification, that we need to see that early and swift intervention.

In conclusion, I think we have a good Bill, but it could be a better Bill. Collectively, we must ensure that it is the best Bill that is possible, so that we do not face a situation in which, for families and for children—either of the current generation or of future ones—we let them down in the way that the previous generations have been let down.

My Lords, it has been well observed that the social media companies and YouTube are now the public square—only, of course, they are not public at all but privately owned companies whose primary concern is to earn profits for their shareholders in the normal way. Against this, the reality is that we have effectively outsourced our censorship to Silicon Valley AI bots, and, faced with the prospect of enormous fines for breaching the new laws, these private companies are going to programme the AI bots on the side of caution. The bots, after all, have no way of knowing the legal cut-off point between mature teenagers and immature adults, and, of course, the censoring bot has no sense of irony or satire or parody or context.

The threat to free speech will therefore now come from two sources. First, as we have seen from the Twitter files, from Big Brother Watch’s Ministry of Truth report and from Matt Hancock’s diaries, Governments covertly lean on the platforms to suppress dissent from the official line. Secondly, the threat will come from these private companies instructing the bots not to go anywhere near anything that might upset the Governments. In this sense, both have crossed the line between attacking disinformation and attacking dissent, and the ability to express dissent is at the core of freedom of speech. We therefore now have the reality of big government and big tech working together to suppress freedom of expression.

I am looking forward to initiating or supporting any amendments that will check the power of government or big tech to shut down legitimate questioning voices, which, from the Great Barrington declaration to the Wuhan lab-leak theory to the ineffectiveness of masks to the collateral damage caused by the lockdowns, over and over again have often proved to be closer to the truth than the official government line at the time.

I would like to use the few moments left to support resistance to restricting end-to-end encryption, to support the initiatives of the noble Lord, Lord Bethell, on age verification, and to follow the lead of the noble Baroness, Lady Kidron, on child safety initiatives.

My Lords, I thank the Minister for his detailed introduction and his considerable engagement on the Bill to date. This has been a comprehensive, heartfelt and moving debate, with a great deal of cross-party agreement about how we must regulate social media going forward. With 66 speakers, however, I sadly will not be able to mention many significant contributors by name.

It has been a long and winding road to get to this point, as noble Lords have pointed out. As the Minister pointed out, along with a number of other noble Lords today, I sat on the Joint Committee which reported as far back as December 2021. I share the disappointment of many that we are not further along with the Bill. It is still a huge matter of regret that the Government chose not to implement Part 3 of the DEA in 2019. Not only, as mentioned by many, have we had a cavalcade of five Culture Secretaries, we have diverged a long way from the 2019 White Paper with its concept of the overarching duty of care. I share the regret that the Government have chosen to inflict last-minute radical surgery on the Bill to satisfy the, in my view, unjustified concerns of a very small number in their own party.

Ian Russell—I pay tribute to him, like other noble Lords—and the Samaritans are right that this is a major watering down of the Bill. Mr Russell showed us just this week how Molly had received thousands and thousands of posts, driven at her by the tech firms’ algorithms, which were harmful but would still be classed as legal. The noble Lord, Lord Russell, graphically described some of that material. As he said, if the regulator does not have powers around that content, there will be more tragedies like Molly’s.

The case for proper regulation of harms on social media was made eloquently to us in the Joint Committee by Ian and by witnesses such as Edleen John of the FA and Frances Haugen, the Facebook whistleblower. The introduction to our report makes it clear that the key issue is the business model of the platforms, as described by the noble Lords, Lord Knight and Lord Mitchell, and the behaviour of their algorithms, which personalise and can amplify harmful content. A long line of reports by Select Committees and all-party groups have rightly concluded that regulation is absolutely necessary given the failure of the platforms even today to address these systemic issues. I am afraid I do not agree with the noble Baroness, Lady Bennett; being a digital native is absolutely no protection—if indeed there is such a thing as a digital native.

We will be examining the Bill and amendments proposed to it in a cross-party spirit of constructive criticism on these Benches. I hope the Government will respond likewise. The tests we will apply include: effective protections for children and vulnerable adults; transparency of systems and power for Ofcom to get to grips with the algorithms underlying them; that regulation is practical and privacy protecting; that online behaviour is treated on all fours with offline; and that there is a limitation of powers of the Secretary of State. We recognise the theme which has come through very strongly today: the importance of media literacy.

Given that there is, as a result of the changes to the Bill, increased emphasis on illegal content, we welcome the new offences, recommended in the main by the Law Commission, such as hate and communication crimes. We welcome Zach’s law, against sending flashing images or “epilepsy trolling”, as it is called, campaigned for by the Epilepsy Society, which is now in Clause 164 of the Bill. We welcome too the proposal to make an offence of encouraging self-harm. I hope that more is to come along the lines requested by my noble friend Lady Parminter.

There are many other forms of behaviour which are not and will not be illegal, and which may, according to terms of service, be entirely legal, but are in fact harmful. The terms of service of a platform acquire great importance as a result of these changes. Without “legal but harmful” regulation, platforms’ terms of service may not reflect the risks to adults on that service, and I was delighted to hear what the noble Baroness, Lady Stowell, had to say on this. That is why there must be a duty on platforms to undertake and publish risk and impact assessments on the outcomes of their terms of service and the use of their user empowerment tools, so that Ofcom can clearly evaluate the impact of their design and insist on changes or adherence to terms of service, issue revised codes or argue for more powers as necessary, for all the reasons set out by the noble Baroness, Lady Gohir, and my noble friend Lady Parminter.

The provisions around user empowerment tools have now become of the utmost importance as a result of these changes. However, as Carnegie, the Antisemitism Policy Trust, and many noble Lords today have said, these should be on by default to protect those suffering from poor mental health or who might lack the faculty to turn them on.

Time is short today, so I can give only a snapshot of where else we on these Benches—and those on others, I hope—will be focusing in Committee. The current wording around “content of democratic importance” and “journalistic content” creates a lack of clarity for moderation processes. As recommended by the Joint Committee, these definitions should be replaced with a single statutory requirement to protect content where there are reasonable grounds to believe it will be in the public interest, as supported by the Equality and Human Rights Commission.

There has been a considerable amount of focus on children today, and there are a number of amendments that have clearly gained a huge amount of support around the House, and from the Children’s Charities’ Coalition on Internet Safety. They were so well articulated by the noble Baroness, Lady Kidron. I will not adumbrate them, but they include that children’s harms should be specified in the Bill, that we should include reference to the UN convention, and that there should be provisions to prevent online grooming. Particularly in the light of what we heard this week, we absolutely support those campaigning to ensure that the Bill provides for coroners to have access to children’s social media accounts after their deaths. We want to see Minister Scully’s promise to look at this translate into a firm government amendment.

We also need to expressly future-proof the Bill. It is not at all clear whether the Bill will be adequate to regulate and keep safe children in the metaverse. One has only to read the recent Institution of Engineering and Technology report, Safeguarding the Metaverse, and the report of the online CSA covert intelligence team, to realise that it is a real problem. We really need to make sure that we get the Bill right from this point of view.

As far as pornography is concerned, if we needed any more convincing of the issues surrounding children’s access to pornography, the recent research by the Children’s Commissioner, mentioned by several noble Lords, is the absolute clincher. It underlines the importance of the concerns of the coalition of charities, the noble Lord, Lord Bethell, and many other speakers today, who believe that the Online Safety Bill does not go far enough to prevent children accessing harmful pornographic content. We look forward to debating those amendments when they are put forward by the noble Lord, Lord Bethell.

We need to move swiftly on Part 5 in particular. The call to have a clear time limit to bring it in within six months of the Bill becoming law is an absolutely reasonable and essential demand.

We need to enshrine age-assurance principles in the Bill. The Minister is very well aware of issues relating to the Secretary of State’s powers. They have been mentioned by a number of noble Lords, and we need to get them right. Some can be mitigated by further and better parliamentary scrutiny, but many should simply be omitted from the Bill.

As has been mentioned by a number of noble Lords, there is huge regret around media literacy. We need to ensure that there is a whole-of-government approach to media literacy, with specific objectives set for not only Ofcom but the Government itself. I am sure that the noble Lord, Lord Stevenson, will be talking about an independent ombudsman.

End-to-end encryption has also come up; of course, that needs protecting. Clause 110 on the requirement by Ofcom to use accredited technology could lead to a requirement for continual surveillance. We need to correct that as well.

There is a lot in the Bill. We need to debate and tackle the issue of misinformation in due course, but this may not be the Bill for it. There are issues around what we know about the solutions to misinformation and disinformation and the operation of algorithmic amplification.

The code for violence against women and girls has been mentioned. I look forward to debating that and making sure that Ofcom has the power and the duty to produce a code which will protect women and girls against that kind of abuse online. We will no doubt consider criminal sanctions against senior managers as well. A Joint Committee, modelled on the Joint Committee on Human Rights, to ensure that the Bill is future-proofed along the lines that the noble Lords, Lord Inglewood and Lord Balfe, talked about is highly desirable.

The Minister was very clear in his opening remarks about what amendments he intends to table in Committee. I hope that he has others under consideration and that he will be in listening mode with regard to the changes that the House has said it wants to see today. Subject to getting the Bill in the right shape, these Benches are very keen to see early implementation of its provisions. I hope that the Ofcom implementation road map will be revised, and that the Minister can say something about that. It is clearly the desire of noble Lords all around the House to improve the Bill, but we also want to see it safely through the House so that the long-delayed implementation can start.

This Bill is almost certainly not going to be the last word on the subject, as the noble Baroness, Lady Merron, very clearly said at the beginning of this debate, but it is a vital start. I am glad to say that today we have started in a very effective way.

My Lords, I start by apologising for having absented myself during part of the debate. I promise those noble Lords whose speeches I missed that I will read them very carefully. The reason is slightly self-serving: I decided to tear up my speech, for two reasons. First, I suddenly realised that the noble Lord, Lord Clement-Jones, being the brilliant lawyer he has been and still is, would probably say everything I was going to say but better—and indeed that has proved to be the case. There is not much point in me boring noble Lords by trying to repeat what he said. The list of items I had is almost exactly identical. I did not give it to him, but we had an exchange of views before the debate, so I was not surprised by that. I will come on to that point.

Secondly, I want to deal with the noble Lord, Lord Hastings, who challenged me in my very junior position as an acting Front-Bencher to commit the Labour Government to a future policy on media education. I am sure the noble Lord opposite will not out-trump me on this one, but I cannot do that. I will, however, get back at him, because I will say that the BBC has never been in better shape than when he was the PR person operating at the front of it. In fact, I do not think it has recovered since he left, so there you are. I think that what he said was quite important.

One of the big, strange things about media education—in fact, this is true of most education policy—is that it is very hard to get changes in the education system. That is partly because it is now so disparate and uncoordinated in policy terms that you cannot say that there is a core curriculum, or that it will include media education and that that will be examined on the following days, as they might do in other countries such as France. The Government should think very hard about how they might take forward the idea from the noble Lord, Lord Hastings. My answer is that you have to examine media education or assess it in some way, otherwise schools will not care about it. This is really a question for Ofsted, not Ofcom. In a sense, the Government have got it right there, but if we could put some pressure on Ofsted to include in its assessment of all schools—indeed, all education at that level—some form of ability to assess whether media education is meeting the needs of Ofcom or the needs of society, we might make some progress. Let us work on that together.

I declare an interest as a member of the Joint Committee on the pre-legislative scrutiny of the Bill. That was a wonderful experience and has been mentioned by others. I am also a former member of the Communications and Digital Committee. I should also drop in that I am a veteran of the Digital Economy Act—much mentioned today—so I have been there, got the scars and am aware of the issues very clearly.

The second reason why I wanted to tear up my speech was that it seemed to me that, as the noble Lord, Lord Clement-Jones, said, there has been an extraordinary amount of agreement on the issues facing the House in trying to get this Bill right. They are not fuelled in any sense by party-political points, because we have no political issue in this, and I do not think the Liberal Democrats or Cross Benches have. We are talking about an issue that we want to address together. I will come back at the end with a proposal, which I think is slightly novel, for how we might take advantage of that. I do not think we want to get ourselves into a situation of antagonism—firing amendments across the Dispatch Box during Committee—because we are broadly agreed about where we want to go. Yes, there are differences of detail, but we have to think about it. I want to come back to that as an issue—and that was what I was doing while I was away.

I want to go back to the introduction to the Joint Committee report, as I would have done in my original speech, because it says so much about what we have been doing in the last two or three years. Self-regulation of online services had failed. While the online world has revolutionised our lives and created many benefits, underlying systems designed to service business models based on data harvesting and micro-targeted advertising shape the way we experience it. Algorithms, invisible to the public, decide what we see, hear and experience. For some service providers, this means valuing the engagement of users at all cost, regardless of what holds their attention. This can result in amplifying the false over the true, the extreme over the considered, and the harmful over the benign. The human cost can be counted in mass murders in Myanmar, intensive care beds full of unvaccinated Covid-19 patients, insurrection at the US Capitol, and teenagers sent down rabbit holes of content promoting self-harm, eating disorders and suicide. As we have learned, we do not just mean teenagers—there are others involved in that. As the noble Baroness, Lady Kidron, and others have reminded us, too many children have suffered from infractions of this type. I pay tribute, again, to Ian Russell—who is still with us—for his campaign and for his extraordinary willingness to share his story. We all owe him a great debt.

These points, already made in other speeches, are important; they are at the heart of what this is about. This is about finding a way of organising what we all value, want and need, in a way that will allow us to get the benefits from it without paying the price that we already are. This debate, in the best traditions of this House, has brought a lot of views to bear on this, but, as I have tried to explain, it seems to me that a lot of them are very similar. There are differences and one or two outliers, but the points made broadly point in one direction: that the Bill is nearly there. It needs a little work and a bit of polishing and it will get over the finishing line.

The Bill needs to be in its best shape—there is no doubt about that—but we could identify alongside it the other issues that we will need to return to in future. We should not worry about that; I think we have all agreed that there will be other opportunities to do so. As we were reminded by the noble Lord, Lord Black, and others, there are other elements that also need to go ahead, and we should be thinking harder about them—the DMU and the need for competition in this whole area. As I said, the noble Lord, Lord Clement-Jones, gave a very good summary of all the issues; I will not run through them again because it was exactly what I would have said myself.

We are in a very strange situation. There is no political divide and we all want the same things: we want the Bill improved and we want to see it pass as soon as possible. I am assuming that the Government will work with us on that—that is an assumption, because that is not the normal way it goes. I am assuming also that they recognise that there are one or two quite sensible compromises to be made—again, that is not a given, but I am getting a few nods that suggest that it might be the case. From this side, I cannot think of any issue that I have heard today, or in any of the discussions we have had recently about this Bill—and they have gone on for a number of years—that we would push to ping-pong. That is very unusual.

I suggest that we try to work together on getting the best Bill we can—while, of course, going through the various stages, because these things all eventually have to go back into the Bill—avoiding the war of attrition approach that so often bedevils the work we do here. Such an approach is important when there are big political issues at stake, but there are not, so let us use that and try to move forward. I would like to get together quite quickly and identify the policies we can move on together, and to take a route forward which will minimise the votes and the dissent and yet deliver the Bill, let us hope, by Report. That is a big ask; I do not think it has been done, except during wartime. But we are at war—at war with these people who are trying to run our lives, and we should try to get together and defeat them. It is unusual, but we live in unusual times. I look forward to hearing from the Minister.

My Lords, I am grateful to the very many noble Lords who have spoken this afternoon and this evening. They have spoken with passion—we heard that in the voices of so many—about their own experiences, the experiences of their families and the experiences of far too many of our fellow subjects, who have harrowing examples of the need for this Bill. But noble Lords have also spoken with cool-headed precision and forensic care about the aspects of the Bill that demand our careful scrutiny. Both hearts and heads are needed to make this Bill worth the wait.

I am very grateful for the strong consensus that has come through in noble Lords’ speeches on the need to make this Bill law and to do so quickly, and therefore to do our work of scrutiny diligently and speedily. I am grateful for the very generous and public-spirited offer the noble Lord, Lord Stevenson, has just issued. I, too, would like to make this not a party-political matter; it is not and has not been in the speeches we have heard today. The work of your Lordships’ House is to consider these matters in detail and without party politics intruding, and it would be very good if we could proceed on the basis of collaboration, co-operation and, on occasion, compromise.

In that spirit, I should say at the outset that I share the challenge faced by the noble Lords, Lord Clement-Jones and Lord Stevenson. Given that so many speakers have chosen to contribute, I will not be able to cover or acknowledge everyone who has spoken. I shall undoubtedly have to write on many of the issues to provide the technical detail that the matters they have raised deserve. It is my intention to write to noble Lords and invite them to join a series of meetings to look in depth at some of the themes and areas between now and Committee, so that as a group we can have well-informed discussions in Committee. I shall write with details suggesting some of those themes, and if noble Lords feel that I have missed any, or particular areas they would like to continue to talk about, please let me know and I will be happy to facilitate those.

I want to touch on a few of the issues raised today. I shall not repeat some of the points I made in my opening speech, given the hour. Many noble Lords raised the very troubling issue of children accessing pornography online, and I want to talk about that initially. The Government share the concerns raised about the lack of protections for children from this harmful and deeply unsuitable content. That is why the Bill introduces world-leading protections for children from online pornography. The Bill will cover all online sites offering pornography, including commercial pornography sites, social media, video-sharing platforms and fora, as well as search engines, which play a significant role in enabling children to access harmful and age-inappropriate content online. These companies will have to prevent children accessing pornography or face huge fines. To ensure that children are protected from this content, companies will need to put in place measures such as age verification, or demonstrate that the approach they are taking delivers the same level of protection for children.

While the Bill does not mandate that companies use specific technologies to comply with these new duties, in order to ensure that the Bill is properly future-proofed, we expect Ofcom to take a robust approach to sites which pose the highest risk of harm to children, including sites hosting online pornography. That may include directing the use of age verification technologies. Age verification is also referred to in the Bill. This is to make clear that these are measures that the Government expect to be used for complying with the duties under Part 3 and Part 5 to protect children from online pornography. Our intention is to have the regime operational as soon as possible after Royal Assent, while ensuring that the necessary preparations are completed effectively and that service providers understand what is expected of them. We are working very closely with Ofcom to ensure this.

The noble Lord, Lord Morrow, and others asked about putting age verification in the Bill more clearly, as was the case with the Digital Economy Act. The Online Safety Bill includes references to age assurance and age verification in the way I have just set out. That is to make clear that these are measures which the Government expect to be used for complying with the duties where proportionate to do so. While age assurance and age verification are referred to in the Bill, the Government do not mandate the use of specific approaches or technologies. That is similar to the approach taken in the Digital Economy Act, which did not mandate the use of a particular technology either.

I think my noble friend Lord Bethell prefers the definition of pornography in Part 3 of the Digital Economy Act. There is already a robust definition of “pornographic content” in this Bill which is more straightforward for providers and Ofcom to apply. That is important. The definition we have used is similar to the definition of pornographic content used in existing legislation such as the Coroners and Justice Act 2009. It is also in line with the approach being taken by Ofcom to regulate UK-established video-sharing platforms, meaning that the industry will already have familiarity with this definition and that Ofcom will already have experience in regulating content which meets this definition. That means it can take action more swiftly. However, I have heard the very large number of noble Lords who are inclined to support the work that my noble friend is doing in the amendments he has proposed. I am grateful for the time he has already dedicated to conversations with the Secretary of State and me on this and look forward to discussing it in more detail with him between now and Committee.

A number of noble Lords, including the noble Baronesses, Lady Finlay of Llandaff and Lady Kennedy of The Shaws, talked about algorithms. All platforms will need to undertake risk assessments for illegal content. Services likely to be accessed by children will need to undertake a children’s risk assessment to ensure they understand the risks associated with their services. That includes taking into account in particular the risk of algorithms used by their service. In addition, the Bill includes powers to ensure that Ofcom is able effectively to assess whether companies are fulfilling their regulatory requirements, including in relation to the operation of their algorithms. Ofcom will have the power to require information from companies about the operation of their algorithms and the power to investigate non-compliance, as well as the power to interview employees. It will have the power to require regulated service providers to undergo a skilled person’s report and to audit company systems and processes, including in relation to their algorithms.

The noble Baroness, Lady Kidron, rightly received many tributes for her years of work in relation to so many aspects of this Bill. She pressed me on bereaved parents’ access to data and, as she knows, it is a complex issue. I am very grateful to her for the time she has given to the meetings that the Secretary of State and I have had with her and with colleagues from the Ministry of Justice on this issue, which we continue to look at very carefully. We acknowledge the distress that some parents have indeed experienced in situations such as this and we will continue to work with her and the Ministry of Justice very carefully to assess this matter, mindful of its complexities which, of course, were something the Joint Committee grappled with as well.

The noble Baroness, Lady Featherstone, my noble friend Lady Wyld and others focused on the new cyberflashing offence and suggested that a consent-based approach would be preferable. The Law Commission looked at that in drawing up its proposals for action in this area. The Law Commission’s report raised concerns about the nature of consent in instant messaging conversations, particularly where there are misjudged attempts at humour or intimacy that could particularly affect young people. There is a risk, which we will want to explore in Committee, of overcriminalising young people. That is why the Government have brought forward proposals based on the Law Commission’s work. If noble Lords are finding it difficult to see the Law Commission’s reports, I am very happy to draw them to their attention so that they can benefit from the consultation and thought it conducted on this difficult issue.

The noble Baroness, Lady Gohir, talked about the impact on body image of edited images in advertising. Through its work on the online advertising programme, DCMS is considering how the Government should approach advertisements that contribute to body image concerns. A consultation on this programme closed in June 2022. We are currently analysing the responses to the consultation and developing policy. Where there is harmful user-generated content related to body image that risks having an adverse physical or psychological impact on children, the Online Safety Bill will require platforms to take action against that. Under the Bill’s existing risk assessment duties, regulated services are required to consider how media literacy can be used to mitigate harm for child users. That could include using content provenance technology, which can empower people to identify when content has been digitally altered in ways such as the noble Baroness mentioned.

A number of noble Lords focused on the changes made in relation to the so-called “legal but harmful” measures to ensure that adults have the tools they need to curate and control their experience online. In particular, noble Lords suggested that removing the requirement for companies to conduct risk assessments in relation to a list of priority content harmful to adults would reduce protections available for users. I do not agree with that assessment. The new duties will empower adult users to make informed choices about the services they use and to protect themselves on the largest platforms. The new duties will require the largest platforms to enforce all their terms of service regarding the moderation of user-generated content, not just the categories of content covered in a list in secondary legislation. The largest platforms already prohibit the most abusive and harmful content. Under the new duties, platforms will be required to keep their promises to users and take action to remove it.

There was rightly particular focus on vulnerable adult users. The noble Baronesses, Lady Hollins and Lady Campbell of Surbiton, and others spoke powerfully about that. The Bill will give vulnerable adult users, including people with disabilities, greater control over their online experience too. When using a category 1 service, they will be able to reduce their exposure to online abuse and hatred by having tools to limit the likelihood of their encountering such content or to alert them to the nature of it. They will also have greater control over content that promotes, encourages or provides instructions for suicide, self-harm and eating disorders. User reporting and redress provisions must be easy to access by all users, including people with a disability and adults with caring responsibilities who are providing assistance. Ofcom is of course subject to the public sector equality duty as well, so when performing its duties, including writing its codes of practice, it will need to take into account the ways in which people with protected characteristics, including people with disabilities, can be affected. I would be very happy to meet the noble Baronesses and others on this important matter.

The noble Lords, Lord Hastings of Scarisbrick and Lord Londesborough, and others talked about media literacy. The Government fully recognise the importance of that in achieving online safety. As well as ensuring that companies take action to keep users safe through this Bill, we are taking steps to educate and empower them to make safe and informed choices online. First, the Bill strengthens Ofcom’s existing media literacy functions. Media literacy is included in Ofcom’s new transparency reporting and information-gathering powers. In response to recommendations from the Joint Committee, the legislation also now specifies media literacy in the risk-assessment duties. In July 2021, DCMS published the online media literacy strategy, which sets out our ambition to improve national media literacy. We have committed to publishing annual action plans in each financial year until 2024-25, setting out our plans to deliver that. Furthermore, in December of that year, Ofcom published Ofcom’s Approach to Online Media Literacy, which includes an ambitious range of work focusing on media literacy.

Your Lordships’ House is, understandably, not generally enthusiastic about secondary legislation and secondary legislative powers, so I was grateful for the recognition by many tonight of the importance of providing for them in certain specific instances through this Bill. As the noble Lord, Lord Brooke of Alverthorpe, put it, there may be loopholes that Parliament wishes to close, and quickly. My noble friend Lord Inglewood spoke of the need for “living legislation”, and it is important to stress, as many have, that this Bill seeks to be technology-neutral—not specifying particular technological approaches that may quickly become obsolete—in order to cater for new threats and challenges as yet not envisaged. Some of those threats and challenges were alluded to in the powerful speech of my noble friend Lord Sarfraz. I know noble Lords will scrutinise those secondary powers carefully. I can tell my noble friend that the Bill does apply to companies that enable users to share content online or interact with each other, as well as to search services. That includes a broad range of services, including the metaverse. Where haptics enable user interaction, companies must take action. The Bill is also clear that content generated by bots is in scope where it interacts with user-generated content such as on Twitter, but not if the bot is controlled by or on behalf of the service, such as providing customer services for a particular site.

Given the range of secondary powers and the changing technological landscape, a number of noble Lords understandably focused on the need for post-legislative scrutiny. The Bill has undoubtedly benefited from pre-legislative scrutiny. As I said to my noble friend Lady Stowell of Beeston in her committee last week, we remain open-minded on the best way of doing that. We must ensure that once this regime is in force, it has the impact we all want it to have. Ongoing parliamentary scrutiny will be vital in ensuring that is the case. We do not intend to legislate for a new committee, not least because it is for Parliament itself to decide what committees it sets up. But I welcome further views on how we ensure that we have effective parliamentary scrutiny, and I look forward to discussing that in Committee. We have also made it very clear that the Secretary of State will undertake a review of the effectiveness of the regime between two and five years after it comes into force, producing a report that will then be laid in Parliament, thus providing a statutory opportunity for Parliament to scrutinise the effectiveness of the legislation.

My noble friend and other members of her committee followed up with a letter to me about the Secretary of State’s powers. I shall reply to that letter in detail and make that available to all noble Lords to see ahead of Committee. This is ground-breaking legislation, and we have to balance the need for regulatory independence with the appropriate oversight for Parliament and the Government. In particular, concerns were raised about the Secretary of State’s power of direction in Clause 39. Ofcom’s independence and expertise will be of utmost importance here, but the very broad nature of online harms means that there may be subjects that go beyond its expertise and remit as a regulator. That was echoed by Ofcom itself when giving evidence to the Joint Committee: it noted that there will clearly be some issues in respect of which the Government have access to expertise and information that the regulator does not, such as national security.

The framework in the Bill ensures that Parliament will always have the final say on codes of practice, and the use of the affirmative procedure will further ensure that there is an increased level of scrutiny in the exceptional cases where that element of the power is used. As I said, I know that we will look at that in detail in Committee.

My noble friend Lord Black of Brentwood, quoting Stanley Baldwin, talked about the protections for journalistic content. He and others are right that the free press is a cornerstone of British democracy; that is why the Bill has been designed to protect press and media freedom and why it includes robust provisions to ensure that people can continue to access diverse news sources online. Category 1 companies will have a new duty to safeguard all journalistic content shared on their platform, which includes citizen journalism. Platforms will need to put systems and processes in place to protect journalistic content, and they must enforce their terms of service consistently across all moderation and in relation to journalistic content. They will also need to put in place expedited appeals processes for producers of journalistic content.

The noble Baroness, Lady Anderson of Stoke-on-Trent, spoke powerfully about the appalling abuse and threats of violence she has sustained while carrying out her democratic duties, and the noble Baroness, Lady Foster, spoke powerfully of the way in which that is deterring people, particularly women, from entering public life. The noble Baroness, Lady Anderson, asked about a specific issue: the automatic deletion of material and the implications for prosecution. We have been mindful of the scenario where malicious users post threatening content which they then delete themselves, and of the burden on services that retaining that information in bulk would cause. We have also been mindful of the imperative to ensure that illegal content cannot be shared and amplified online by being left there. The retention of data for law enforcement purposes is strictly regulated, particularly through the Investigatory Powers Act, which the noble Lord, Lord Anderson of Ipswich, is reviewing at the request of the Home Secretary. I suggest that the noble Baroness and I meet to speak about that in detail, mindful of that ongoing review and the need to bring people to justice.

The noble Baroness, Lady Chakrabarti, asked about sex for rent. Existing offences can be used to prosecute that practice, including Sections 52 and 53 of the Sexual Offences Act 2003, both of which are listed as priority offences in Schedule 7 to the Bill. As a result, all in-scope services must take proactive measures to prevent people being exposed to such content.

The noble Lord, Lord Davies of Brixton, and others talked about scams. The largest and most popular platforms and search engines—category 1 and category 2A services in the Bill—will have a duty to prevent paid-for fraudulent adverts appearing on their services, making it harder for fraudsters to advertise scams online. We know that that can be a particularly devastating crime. The online advertising programme builds on this duty in the Bill and will look at the role of the whole advertising system in relation to fraud, as well as the full gamut of other harms it can cause.

My noble friend Lady Fraser talked about the devolution aspects, which we will certainly look at. Internet services are a reserved matter for the UK Government. The list of priority offences in Schedule 7 can be updated only by the Secretary of State, subject to approval by this Parliament.

The right reverend Prelate the Bishop of Manchester asked about regulatory co-operation, and we recognise the importance of that. Ofcom has strong existing relationships with other regulators, such as the ICO and the CMA, which have been supported and strengthened by the establishment of the Digital Regulation Cooperation Forum in 2020. We have used the Bill to strengthen Ofcom’s ability to work closely with, and to disclose information to, other regulatory bodies. Clause 104 ensures that Ofcom can do that, and the Bill also requires Ofcom to consult the Information Commissioner.

I do not want to go on at undue length—I am mindful of the fact that we will have detailed debates on all these issues and many more in Committee—but I wish to conclude by reiterating my thanks to all noble Lords, including the many who were not able to speak today but to whom I have already spoken outside the Chamber. They all continue to engage constructively with this legislation to ensure that it meets our shared objectives of protecting children and giving people a safe experience online. I look forward to working with noble Lords in that continued spirit.

My noble friend Lady Morgan of Cotes admitted to being one of the cavalcade of Secretaries of State who have worked on this Bill; I pay tribute to her work both in and out of office. I am pleased that my right honourable friend the Secretary of State was here to observe part of our debate today and, like all noble Lords, I am humbled that Ian Russell has been here to follow our debate in its entirety. The experience of his family and too many others must remain uppermost in our minds as we carry out our duty on the Bill before us; I know that it will be. We have an important task before us, and I look forward to getting to it.

Bill read a second time.