Committee (4th Day) (Continued)
Clause 11: Safety duties protecting children
Debate on Amendment 29 resumed.
My Lords, I support the noble Baroness, Lady Ritchie, in her search to make it clear that we do not need to take a proportionate approach to pornography. I would be delighted if the Minister could indicate in his reply that the Government will accept the age-assurance amendments in group 22 that are coming shortly, which make it clear that porn on any regulated service, under Part 3 or Part 5, should be behind an age gate.
In making the case for that, I want to say very briefly that, after the second day of Committee, I received a call from a working barrister who represented 90 young men accused of serious sexual assault. Each was a student and many were in their first year. A large proportion of the incidents had taken place during freshers’ week. She rang to make sure that we understood that, while what each and every one of them had done was indefensible, these men were also victims. As children brought up on porn, they believed that their sexual violence was normal—indeed, they told her that they thought that was what young women enjoyed and wanted. On this issue there is no proportionality.
My Lords, I also support Amendments 29, 83 and 103 from the noble Baroness, Lady Ritchie. As currently drafted, the Bill makes frequent reference to Ofcom taking into account
“the size and capacity of … a service”
when it determines the extent of the measures a site should apply to protect children. We have discussed size on previous days; I am conscious that the point has been made in part, but I hope the Committee will forgive me if I repeat it clearly. When it comes to pornography and other harms to children, size does matter. As I have said many times recently, porn is porn no matter the size of the website or publisher involved with it. It does not matter whether it is run by a huge company such as MindGeek or out of a shed in London or Romania by a small gang of people. The harm of the content to children is still exactly the same.
Our particular concern is that, if the regulations from Ofcom are applied only to the bigger companies, that will create a lot of space for smaller organisations which do not bend to the regulations to gain a competitive advantage over the larger players. That is the concern of the bigger players. They are very open to age verification; what concerns them is that they will face an unlevel playing field. It is a classic concern of bigger players facing regulation in any market: that bad actors will gain competitive advantage. We should be very cognisant of that when thinking about how the regulations on age verification for porn will be applied. Therefore, the measures should be applied in proportion to the risk of harm to children posed by a porn site, not in proportion to the site’s financial capacity or the impact on its revenues of basic protections for children.
In this, we are applying basic, real-world principles to the internet. We are denying its commonly held exceptionalism, which I think we are all a bit tired of. We are applying the same principles that you might apply in the real world, for instance, to a kindergarten, play centre, village church hall, local pub, corner shop or any other kind of business that brings itself in front of children. In other words, if a company cannot afford to implement or does not seem capable of implementing measures that protect children, it should not be permitted by law to have a face in front of the general public. That is the principle that we apply in the real world, and that is the principle we should be applying on the internet.
Allowing a dimension of proportionality to apply to pornography cases creates an enormous loophole in the legislation, which at best will delay enforcement for particular sites when it is litigated and at worst will disable regulatory action completely. That is why I support the amendments in the name of the noble Baroness, Lady Ritchie.
My Lords, the proposers of these amendments have made a very good case to answer. My only reservation is that I think there are rather more subtle and proportionate ways of dealing with this—I take on board entirely what the noble Lord, Lord Bethell, says.
I keep coming back to the deliberations that we had in the Joint Committee. We said:
“All statutory requirements on user-to-user services, for both adults and children, should also apply to Internet Society Services likely to be accessed by children, as defined by the Age Appropriate Design Code”.
This goes back to the test that we described earlier, to
“ensure all pornographic websites would have to prevent children from accessing their content”,
and back to that definition,
“likely to be accessed by children”.
The Government keep resisting this aspect, but it is a really important way of making sure that we deal with this proportionately. We are going to have this discussion about minimum age-assurance standards. Rather than simply saying, “It has to be age verification”, if we had a set of principles for age assurance, which can encompass a number of different tools and approaches, that would also help with the proportionality of what we are talking about.
The Government responded to the point we made about age assurance. The noble Baroness, Lady Kidron, was pretty persuasive in saying that we should take this on board in our Joint Committee report, and she had a Private Member’s Bill at the ready to show us the wording, but the Government came back and said:
“The Committee’s recommendations stress the importance of the use of age assurance being proportionate to the risk that a service presents”.
They have accepted that this would be a proportionate way of dealing with it, so this is not black and white. My reservation is that there is a better way of dealing with this than purely driving through these three or four amendments, but there is definitely a case for the Government to answer on this.
My Lords, I think the whole Committee is grateful to my noble friend Lady Ritchie for introducing these amendments so well.
Clearly, there is a problem. The anecdote from the noble Baroness, Lady Kidron, about the call she had had with the barrister relating to those freshers’ week offences, and the sense that people were both offenders and victims, underscored that. In my Second Reading speech I alluded to the problem of the volume of young people accessing pornography on Twitter, and we see the same on Reddit, Discord and a number of other platforms. As the noble Baroness said, it is changing what so many young people perceive to be normal about sexual relationships, and that has to be addressed.
Ofcom very helpfully provided a technical briefing on age assurance and age verification for Members of your Lordships’ House—clearly it did not persuade everybody, otherwise we would not be having this debate. Like the noble Lord, Lord Clement-Jones, I am interested in this issue of whether it is proportionate to require age verification, rather than age assurance.
For example, on Amendment 83 in my noble friend’s name in respect of search, I was trying to work out in my own mind how that would work. If someone used search to look for pornographic content and put in an appropriate set of keywords but was not logged in—so the platform would not know who they are—and if age verification was required, would they be interrupted with a requirement to go through an age-verification service before the search results were served up? Would the search results be served up but without the thumbnails of images and with some of the content suppressed? I am just not quite sure what the user experience would be like with a strict age-verification regime being used, for example, in respect of search services.
My Lords, some light can be shone on that question by thinking a little about what the gambling industry has been through in the last few years as age verification has got tougher in that area. To answer the noble Lord’s question, if someone does not log into their search and looks for a gambling site, they can find it, but when they come to try to place a bet, that is when age verification is required.
That is right. What is interesting about that useful intervention from the noble Lord, Lord Bethell, is that that kind of gets search off the hook in respect of gambling. You are okay to follow the link from the search engine, but then you are age-gated at the point of the content. Clearly, with thumbnail images and so on in search, we need something better than that. The Bill requires something better than that already; should we go further? My question to the Minister is whether this could be similar to the discussion we had with the noble Baroness, Lady Harding, around non-mandatory codes and alternative methods. I thought that the Minister’s response in that case was quite helpful.
Could it be that if Part 3 and category 2A services chose to use age verification, they could be certain that they are compliant with their duties to protect children from pornographic and equivalent harmful content, but if they chose age-assurance techniques, it would then be on them to show Ofcom evidence of how that alternative method would still provide the equivalent protection? That would leave the flexibility of age assurance; it would not require age verification but would still set the same bar. I merely offer that in an attempt to be helpful to the Minister, in the spirit of where the Joint Committee and the noble Lord, Lord Clement-Jones, were coming from. I look forward to the Minister’s reply.
Yes; the noble Baroness is right. She has pointed out in other discussions I have been party to that, for example, gaming technology that looks at the movement of the player can quite accurately work out from their musculoskeletal behaviour, I assume, the age of the gamer. So there are alternative methods. Our challenge is to ensure that if they are to be used, we will get the equivalent of age verification or better. I now hand over to the Minister.
My Lords, I think those last two comments were what are known in court as leading questions.
As the noble Baroness, Lady Ritchie of Downpatrick, said herself, some of the ground covered in this short debate was covered in previous groups, and I am conscious that we have a later grouping where we will cover it again, including some of the points that were made just now. I therefore hope that noble Lords will understand if I restrict myself at this point to Amendments 29, 83 and 103, tabled by the noble Baroness, Lady Ritchie.
These amendments seek to mandate age verification for pornographic content on a user-to-user or search service, regardless of the size and capacity of a service provider. The amendments also seek to remove the requirement on Ofcom to have regard to proportionality and technical feasibility when setting out measures for providers on pornographic content in codes of practice. While keeping children safe online is the top priority for the Online Safety Bill, the principle of proportionate, risk-based regulation is also fundamental to the Bill’s framework. It is the Government’s considered opinion that the Bill as drafted already strikes the correct balance between these two.
The provisions in the Bill on proportionality are important to ensure that the requirements in the child-safety duties are tailored to the size and capacity of providers. It is also essential that measures in codes of practice are technically feasible. This will ensure that the regulatory framework as a whole is workable for service providers and enforceable by Ofcom. I reassure your Lordships that the smaller providers or providers with less capacity are still required to meet the child safety duties where their services pose a risk to children. They will need to put in place sufficiently stringent systems and processes that reflect the level of risk on their services, and will need to make sure that these systems and processes achieve the required outcomes of the child safety duty. Wherever in the Bill they are regulated, companies will need to take steps to ensure that they cannot offer pornographic content online to those who should not see it. Ofcom will set out in its code of practice the steps that companies in the scope of Part 3 can take to comply with their duties under the Bill, and will take a robust approach to sites that pose the greatest risk of harm to children, including sites hosting online pornography.
The passage of the Bill should be taken as a clear message to providers that they need to begin preparing for regulation now—indeed, many are. Responsible providers should already be factoring in regulatory compliance as part of their business costs. Ofcom will continue to work with providers to ensure that the transition to the new regulatory framework will be as smooth as possible.
The Government expect companies to use age-verification technologies to prevent children accessing services that pose the highest risk of harm to children, such as online pornography. The Bill will not mandate that companies use specific technologies to comply with new duties because, as noble Lords have heard me say before, what is most effective in preventing children accessing pornography today might not be equally effective in future. In addition, age verification might not always be the most appropriate or effective approach for user-to-user companies to comply with their duties. For instance, if a user-to-user service, such as a social media service, does not allow pornography under its terms of service, measures such as strengthening content moderation and user reporting would be more appropriate and effective for protecting children than age verification. This would allow content to be better detected and taken down, instead of restricting children from seeing content which is not allowed on the service in the first place. Companies may also use another approach if it is proportionate to the findings of the child safety risk assessment and a provider’s size and capacity. This is an important element to ensure that the regulatory framework remains risk-based and proportionate.
In addition, the amendments in the name of the noble Baroness, Lady Ritchie, risk inadvertently shutting children out of large swathes of the internet that are entirely appropriate for them to access. This is because it is impossible totally to eliminate the risk that a single piece of pornography or pornographic material might momentarily appear on a site, even if that site prohibits it and has effective systems in place to prevent it appearing. Her amendments would have the effect of essentially requiring every service to block children through the use of age verification.
Those are the reasons why the amendments before us are not ones that we can accept. Mindful of the fact that we will return to these issues in a future group, I invite the noble Baroness to withdraw her amendment.
My Lords, I thank all noble Lords who have participated in this wide-ranging debate, in which various issues have been raised.
The noble Baroness, Lady Benjamin, made the good point that there needs to be a level playing field between Parts 3 and 5, which I originally raised and which other noble Lords raised on Tuesday of last week. We keep coming back to this point, so I hope that the Minister will take note of it on further reflection before we reach Report. Pornography needs to be regulated on a consistent basis across the Bill.
The noble Baroness, Lady Kidron—I offer my congratulations on her birthday; it was remiss of me not to do so earlier—emphasised the need for clarity and consistency yet again, as well as the effects of pornography, which follow people through their lives, give an unrealistic view of relationships and can lead to increased violence against women. We must always remember that a single encounter with pornography can plague you for the rest of your life: it may play on your mind and have indirect or unintended consequences long afterwards.
The noble Lord, Lord Bethell, talked about equality across the Bill, as well as across websites. He raised yet another great real-world example: if organisations such as schools and nurseries cannot keep people safe, we do not allow them to look after children; if businesses cannot keep children safe, they need to be regulated to do so.
The noble Lord, Lord Clement-Jones, stated that it seems that the view of the Committee is clear: we need principles in the Bill that are universal to keep children safe. That is the clear message throughout the Committee debate so far. There may be a better way, and I hope that we can work with the noble Lord, Lord Clement-Jones, and my noble friend Lord Knight and his colleagues, along with the Government Benches, to achieve that.
My noble friend Lord Knight in his summing up raised an excellent point. Again, I come back to this issue: if we do not have clarity or consistency, none of this work will achieve what it is intended to achieve. If different duties apply and different levels of proportionality exist, that will only create uncertainty.
The Minister made the point that, with pornography now named as a harm to children, as announced on Thursday of last week, he hoped to consider how consistency is brought across the Bill to ensure that children will be kept safe from pornography across all providers in Parts 3 and 5. It seems clear from deliberations in Committee so far that noble Lords do not think that the Bill brings that clarity and consistency. That clearly needs to be addressed and corrected.
This is not about shutting children out; everyone understands that, despite best efforts, some pornography may slip through. It is about consistency. I ask the Minister, in the period between the end of Committee and the beginning of Report, to reflect further on the need for clarity and consistency in dealing with pornography across the Bill. I beg leave to withdraw the amendment.
Amendment 29 withdrawn.
Amendments 30 to 32A not moved.
Clause 11, as amended, agreed.
33: After Clause 11, insert the following new Clause—
“Offence of failing to comply with a relevant duty
(1) The provider of a service to whom a relevant duty applies commits an offence if the provider fails to comply with the duty.
(2) In the application of sections 178(2) and 179(5) to an offence under this section (where the offence has been committed with the consent or connivance of an officer of the entity or is attributable to any neglect on the part of an officer of the entity) the references in those provisions to an officer of an entity include references to any person who, at the time of the commission of the offence—
(a) was (within the meaning of section 93) a senior manager of the entity in relation to the activities of the entity in the course of which the offence was committed; or
(b) was a person purporting to act in such a capacity.
(3) A person who commits an offence under this section is liable on conviction on indictment to—
(a) imprisonment for a term not exceeding two years,
(b) a fine, or
(c) both.
(4) The Secretary of State may by regulations amend the sanctions in subsection (3), and such regulations may—
(a) specify the maximum fine under subsection (3)(b), and
(b) implement a scale to apply in cases where there have been repeated breaches of a relevant duty.
(5) In this section, “relevant duty” means a duty provided for by section 11 of this Act.
(6) Regulations under subsection (4) are subject to the affirmative procedure.”
Member’s explanatory statement
This new Clause would make it an offence for the provider of a user-to-user service not to comply with the safety duties protecting children set out in Clause 11. Where the offence was committed with the consent or connivance of a provider’s senior manager or other officer, or was attributable to their neglect, that person, as well as the entity, would be guilty of the offence.
My noble friend Lord Stevenson apologises that he can no longer be with the Committee, and he apologised to me that I suddenly find myself introducing this amendment. It heads up an important group because it tackles the issue of enforcement and, in essence, how we ensure that Ofcom has all the tools it needs to persuade some of the richest, largest and most litigious companies in the world to comply with the regime we are setting out in the Bill. Amendment 33, which my noble friend tabled and I am moving, sets out an offence of failing to comply with a relevant duty in respect of the child safety duties, and would make that failure an imprisonable offence for a senior manager or other officer where it was attributable to their neglect. I recall that those of us who sat on the Joint Committee discussed the data protection regime and whether there could be an officer, similar to the designated data controller in companies, responsible for the safety duties with which the company would have to comply.
Clearly, this amendment has now been superseded by the government amendments that were promised, and which I am sure my noble friend was looking to flush out with this amendment. Flushed they are, so I will not go into any great detail about Amendment 33, because it is better to give time to the Minister to clarify the Government’s intentions. I shall listen carefully to him, as I will to the noble Lord, Lord Curry, who has great expertise in better regulation and who, I am sure, through talking to his amendments, will give us the benefit of his wisdom on how we can make this stick.
That leaves my Amendment 219, which in essence is about the supply chain that regulated companies use. I am grateful to the noble Lords, Lord Mann and Lord Austin, and the noble Baroness, Lady Deech, for putting their names to the amendment. Their enthusiasm did not run to missing the Arsenal game and coming to support in the Chamber, but that implies great trust in my ability to speak to the amendment, for which I accept the responsibility and compliment.
The amendment was inspired by a meeting that some Members of your Lordships’ House and the other place had in an all-party group that was looking, in particular, at the problems of online incel culture. We heard from various organisations about how incel culture relates to anti-Semitism and misogyny, and how such content proliferates and circulates around the web. It became clear that it is fairly commonplace to use things such as cloud services to store the content, with the links then shared on platforms. On the mainstream platforms, there might be spaces where this content, under the regime we are discussing now that the controversial “legal but harmful” category has gone from the Bill, might be seen as relatively benign, certainly falling within freedom of expression, but where it starts to capture the interest of its target demographic. Those users are then taken off by links to smaller, less regulated sites and, in turn, by links to cloud services where the really harmful content is hosted.
Therefore, by way of what reads as an exceptionally complicated and difficult amendment in respect of entities A, B and C, we are trying to understand whether it is possible to bring in those elements of the supply chain, of the technical infrastructure, that are used to disseminate hateful content. Such content too often leads to young men taking their own lives and to the sort of harm that we saw in Plymouth, where that young man went on the rampage and killed a number of people. His MP was one of the Members of Parliament at that meeting. That is what I want to explore with Amendment 219, which opens the possibility for this regime to ensure that well-resourced platforms cannot hide behind other elements of the infrastructure to evade their responsibilities.
My Lords, I beg the forbearance of the Committee because, despite the best efforts of the Whips, this group includes two major issues that I must tackle.
Starting with senior management liability, I thank the Minister and the entire ministerial team for their engagement on this big and important subject. I am enormously proud of the technology sector and the enormous benefits that it has brought to the economy and to society. I remain a massive champion of innovation and technology in the round. However, senior executives in the technology sphere have had a long-standing blind spot. Their manifesto is that the internet is somehow different from the rest of the real world and that nothing must stand in its way. My noble friend Lord Moylan gave that pony quite a generous trot round the arena, so I will not go through it again, but when it comes to children, they have consistently failed to take seriously their safeguarding responsibilities.
I spoke in Committee last week of my experience at the Ministry of Sound. When I saw the internet in the late 1990s, I immediately saw a wonderful opportunity to target children, to sell to them, to get past their parents and normal regulation, and to get into their homes and their wallets. Lots of other people had the same thought, and for a long time we have let them do what they like. This dereliction of their duty of care has led to significant consequences, and the noble Lord, Lord Russell, spoke very movingly about that. Those consequences are increasing all the time because of the take-up of mobile phones and computers by ever younger children. That has got to stop, and it is why we are here. That is why we have this Bill—to stop those consequences.
To change this, we cannot rely just on rhetoric, fines and self-regulation. We tried that, the experiment has failed, and we must try a different approach. We found that exhortations and a playing-it-nicely approach failed in the financial sector before the financial crisis. We remember the massive economic and societal costs of that failure. Likewise, in the tech sector, senior managers of firms big and small must be properly incentivised and held accountable for identifying and mitigating risks to children in a systematic way. That is why introducing senior management liability for child safety transgressions is critical. Senior management must be accountable for ensuring that child safety permeates the company and be held responsible when risks of serious harm arise or gross failures take place. Just think how the banks have changed their attitude since the financial crisis because of senior liability.
I am pleased that the Government have laid their own amendment, Amendment 200A. I commend the Minister for bringing that forward and am extremely grateful to him and to the whole team for their engagement around this issue. The government amendment creates a new offence, holding senior managers accountable for failure to comply with confirmation decisions from Ofcom relating to protecting children from harmful content. I hope that my noble friend will agree that it is making Ofcom’s job easier by providing clear consequences for the non-enforcement of such decisions.
It is a very good amendment, but there are some gaps, and I would like to address those. It is worrying that the government amendment does not cover duties related to tackling child sexual exploitation and abuse. As it stands, this amendment is a half-measure which fails to hold senior managers liable for the most severe abuse online. Child sexual abuse and exploitation offences are at a record high, as we heard earlier. NSPCC research shows that there has been an 84% rise in online grooming since 2017-18. Tech companies must be held accountable for playing their role in tackling this.
That is why the amendment in my name does the following: first, it increases the scope of the Government’s amendment to make individuals also responsible for confirmation decisions on illegal safety duties related to child sexual abuse and exploitation. Secondly, it brings search services into scope, including both categories of service providers, which is critical for ensuring that a culture of compliance is adopted throughout the sector.
I ask my noble friend the Minister: first, what is the Government’s rationale for not holding senior managers accountable for acting on confirmation decisions related to child sexual abuse offences in their amendment? Secondly, will he commit to discussing this further to ensure that the amendment covers these offences?
I would also like to speak to probing Amendments 220A to 220C in my name. Without effective enforcement, the many words and hours we spend in this House and in the other place talking about the need for robust online safety will come to nothing. Unless we get the enforcement provisions of this Bill right, the aims of the Bill will fail. We know that content providers will not implement the Bill unless they know that there will be significant penalties for non-compliance. Companies often need to know that the consequences of non-compliance will outweigh the benefits of doing nothing. For instance, research on the gambling industry has found that unless companies fear the consequences of non-compliance, they simply will not invest in robust technologies.
The amendments in the name of the noble Lord, Lord Curry, in this group are clearly aimed at this very issue, and I express enormous thanks to the noble Lord. Those amendments seek to remove discretion from the regulator and ensure that enforcement action takes place.
As to the amendments to Clause 138, as your Lordships are aware, Ofcom is required to produce guidance on how it intends to enforce the duties and requirements of the Bill. Ofcom has already set out its road map for enforcement, which gives a start to its framework. But these amendments seek to put some flesh on the bones. Amendment 220A states that guidance must cover four important topics. The first is how ancillary services such as payment providers will be used in the enforcement process if the service provider is either free or uses cryptocurrency or other virtual currency. This is absolutely critical for users and providers. It simply cannot be the case that sites which are free or use alternative payment methods could find themselves able to avoid enforcement.
Secondly, guidance should be produced which shows how internet service providers will be used in access restriction orders. The Government have previously suggested that ISPs are less willing to be involved in policing than is often assumed, but without the ability to block sites in contravention of the measures in the Bill, there is a significant gap in the enforcement toolbox. Blocking content is something ISPs already do: they already block sites to protect intellectual property, such as football and other sporting rights. If you try to stream a Taylor Swift song from an infringing site, you will find out how effective they are at that. It simply cannot be the case that ISPs would deem TV rights more important than child safety.
The third and fourth topics for guidance in the amendment set out what action Ofcom will take if an ancillary service provider, or a person who provides an access facility, fails to act on a relevant court order. We need to know what will happen if such a court order is ignored.
I hope my noble friend the Minister will be able to provide information on how he envisages enforcement will be implemented under this Bill, and I would be glad to meet him to discuss the matter further.
My Lords, in view of the hour, I will be brief, and I have no interests to declare other than that I have grandchildren. I rise to speak to a number of amendments tabled in my name in this group: Amendments 216A to 216C, 218ZZA to 218ZD and 218BA to 218BC. I do not think I have ever achieved such a comprehensive view of the alphabet in a number of amendments.
These amendments carry a simple message: Ofcom must act decisively and quickly. I have tabled them out of a deep concern that the Bill does not specify timescales or obligations within which Ofcom is required to act. It leaves Ofcom, as the regulator, with huge flexibility and discretion as to when it must take action; some action, indeed, could go on for years.
Phrases such as
“OFCOM may vary a confirmation decision”
and
“may apply to the court for an order”
are not strong enough, in my view. If unsuitable or harmful material is populating social media sites, the regulator must take action. There is no sense of urgency in the drafting of the Bill. If contravention is taking place, action needs to be taken very quickly. If Ofcom delays taking action, the harmful influence will continue. If the providers of services know that the regulator will clamp down quickly and severely on those who contravene, they are more likely to comply in the first place.
I was very taken by the earlier comments of the noble Baroness, Lady Harding, about putting additional burdens on Ofcom. These amendments are not designed to put additional burdens on Ofcom; indeed, the noble Lord, Lord Knight, referred to the fact that, for six years, I chaired the Better Regulation Executive. It was my experience that regulators that had a reputation for acting quickly and decisively, and being tough, had a much more compliant base as a consequence.
Noble Lords will be pleased to hear that I do not intend to go through each individual amendment. They all have a single purpose: to require the regulator—in this case, Ofcom—to act when necessary, as quickly as possible within specified timescales; and to toughen up the Bill to reduce the risk of continuous harmful content being promoted on social media.
I hope that the Minister will take these comments in the spirit in which they are intended. They are designed to help Ofcom and to help reduce the continuous adverse influence that many of these companies will propagate if they do not think they will be regulated severely.
My Lords, I understand that, for legislation to have any meaning, it has to have some teeth and you have to be able to enforce it; otherwise, it is a waste of time, especially with something as important as the legislation that we are discussing here.
I am a bit troubled by a number of the themes in these amendments and I therefore want to ask some questions. I saw that the Government had tabled these amendments on senior manager liability, then I read amendments from both the noble Lord, Lord Bethell, and the Labour Party, the Opposition. It seemed to me that even more people would be held liable and responsible as a result. I suppose I have a dread that—even with the supply chain amendment—this means that lots of people are going to be sacked. It seems to me that this might spiral dangerously out of control and everybody could get caught up in a kind of blame game.
I appreciate that I might not have understood, so this is a genuine attempt to do so. I am concerned that these new amendments will force senior managers and, indeed, officers and staff to take an extremely risk-averse approach to content moderation. They now have not only to cover their own backs but to avoid jail. One of my concerns has always been that this will lead to the over-removal of legal speech, and more censorship, so that is a question I would like to ask.
I also want to know how noble Lords think this will lie in relation to the UK being a science and technology superpower. Understandably, some people have argued that these amendments are making the UK a hostile environment for digital investment, and there is something to be balanced up there. Is there a risk that this will lead to the withdrawal of services from the UK? Will it make working for these companies unattractive to British staff? We have already heard that Jimmy Wales has vowed that the Wikimedia Foundation will not scrutinise posts in the way demanded by the Bill. Is he going to be thrown in prison, or will Wikipedia pull out? How do we get the balance right?
What is the criminal offence that has a threat of a prison sentence? I might have misunderstood, but a technology company manager could fail to prevent a child or young person encountering legal but none the less allegedly harmful speech, be considered in breach of these amendments and get sent to prison. We have to be very careful that we understand what this harmful speech is, as we discussed previously. The threshold for harm, which encompasses physical and psychological harm, is vast and could mean people going to prison without the precise criminal offence being clear. We talked previously about VPNs. If a tech-savvy 17-year-old uses a VPN and accesses some of this harmful material, will someone potentially be criminally liable for that young person getting around the law, find themselves accused of dereliction of duty and become a criminal?
My final question is on penalties. When I was looking at this Bill originally and heard about the eye-watering fines that some Silicon Valley companies might face, I thought, “That will destroy them”. Of course, to them it is the mere blink of an eye, and I do get that. This indicates to me, given the endless conversations we have had on whether size matters, that in this instance size does matter. The same kind of liabilities will be imposed not just on the big Silicon Valley monsters that can bear these fines, but on Mumsnet—or am I missing something? Mumsnet might not be the correct example, but could not smaller platforms face similar liabilities if a young person inadvertently encounters harmful material? It is not all malign people trying to do this; my unintended consequence argument is that I do not want to create criminals when a crime is not really being committed. It is a moral dilemma, and I do understand the issue of enforcement.
I rise very much to support the comments of my noble friend Lord Bethell and, like him, to thank the Minister for bringing forward the government amendments. I will try to address some of the comments the noble Baroness, Lady Fox, has just made.
One must view this as an exercise in working out how one drives culture change in some of the biggest and most powerful organisations in the world. Culture change is really hard. It is hard enough in a company of 10 people, let alone in a company with hundreds of thousands of employees across the world that has more money than a single country. That is what this Bill requires these enormous companies to do: to change the way they operate when they are looking at an inevitably congested, contested technology pipeline, by which I mean—to translate that out of tech speak—they have more work to do than even they can cope with. Every technology company, big or small, always has this problem: more good ideas than their technologists can cope with. They have to prioritise what to fix and what to implement. For the last 15 years, digital companies have prioritised things that drive income, but not the safety of our children. That requires a culture change from the top of the company.
I draw heavily on my experience over the last eight years as a non-exec on the Court of the Bank of England, where I have seen first-hand the implementation of the senior managers regime. I have seen it first-hand because of the extraordinary privilege that members of the court have of sitting as observers in Prudential Regulation Authority meetings, but I have also seen it first-hand as the chair of the Bank’s remuneration committee, where I had to sign off as a senior manager. I promise your Lordships that it completely changes your approach to compliance if your own personal name is being used. That makes a huge difference. It does not matter how huge or historic the company, which is why I used the example of the Bank of England; once it is in your name, you behave differently.
We need the very senior managers of these enormous companies to change the way they behave. Sad though this is, I do not believe they will change if it is just about money, as we see time and again. They will change if they have to think about whether, in their own name, they are breaking the law. My understanding of the Government’s amendment—this is where I get to my questions for the Minister—is that they cannot stumble into that by mistake; they have to wilfully ignore the direction of the regulator. I hope the Minister can confirm and explain that.
My other question is: are we confident that the amendment as drafted really tackles the very senior managers? I share some of the concerns of the noble Baroness, Lady Fox: we do not want middle managers, deep in the leviathan of an enormous company, being sacrificial lambs while the company does not really address the issue. We want change from the top to reshape the way these companies think about the trade-offs they have to face. I hope the Minister can clarify that.
My Lords, I support something between the amendments of the noble Lords, Lord Stevenson and Lord Bethell, and the Government. I welcome all three and put on record my thanks to the Government for making a move on this issue.
There are three members of the pre-legislative committee still in the Chamber at this late hour, and I am sure I am not the only one of those three who remembers the excruciating detail in which Suzanne Webb MP, during evidence given with Meta’s head of child safety, established that there was nowhere to report harm, but nowhere—not up a bit, not sideways, not to the C-suite. It was stunning. I have used that clip from the committee’s proceedings several times in schools to show what we do in the House of Lords, because it was fascinating. That fact was also made abundantly clear by Frances Haugen. When we asked her why she took the risk of copying things and walking them out, she said, “There was nowhere to go and no one to talk to”.
Turning to the amendments, like the noble Baroness, Lady Harding, I am concerned about whether we have properly dealt with C-suite reporting and accountability, but I am a hugely enthusiastic supporter of that accountability being in the system. I will be interested to hear the Minister speak to the Government’s amendment, but also to some of the other issues raised by the noble Lord, Lord Knight.
I will comment very briefly on the supply chain and Amendment 219. Doing so, I go back again to Amendment 2, debated last week, which sought to add services not covered by the current scope but which clearly promoted and enabled access to harm and which were also likely to be accessed by children. I have a long quote from the Minister but, because of the hour, I will not read it out. In effect, and to paraphrase, he said, “Don’t worry, they will be caught by the other guys—the search and user-to-user platforms”. If the structure of the Bill means that it is mandatory that the user-to-user and search platforms catch the people in the supply chain, surely it would be a great idea to put that in the Bill absolutely explicitly.
Finally, while I share some of the concerns raised by the noble Baroness, Lady Fox, I repeat my constant reprise of “risk not size”. The size of the fine is related to the turnover of the company, so it is actually proportionate.
My Lords, this has been a really interesting debate. I started out thinking that we were developing quite a lot of clarity. The Government have moved quite a long way since we first started debating senior manager liability, but there is still a bit of fog that needs dispelling—the noble Baronesses, Lady Kidron and Lady Harding, have demonstrated that we are not there yet.
I started off by saying yes to this group, before I got to grips with the government amendments. I broadly thought that Amendment 33, tabled by the noble Lord, Lord Stevenson, and Amendment 182, tabled by the noble Lord, Lord Bethell, were heading in the right direction. However, I was stopped short by Trustpilot’s briefing, which talked about a stepped approach regarding breaches and so on—that is a very strong point. It says that it is important to recognise that not all breaches should carry the same weight. In fact, it is even more than that: certain things should not even be an offence, unless you have been persistent or negligent. We have to be quite mindful as to how you formulate criminal offences.
I very much liked what the noble Lord, Lord Bethell, had to say about the tech view of its own liability. We have all seen articles about tech exceptionalism, and, for some reason, that seems to have taken quite a hold—so we have to dispel that as well. That is why I very much liked what the noble Lord, Lord Curry, said. It seemed to me that that was very much part of a stepped approach, while also being transparent, to the company involved, about the object of the exercise. That fits very well with the architecture of the Bill.
The noble Baroness, Lady Harding, put her finger on it: the Bill is not absolutely clear. In the Government’s response to the Joint Committee’s report, we were promised that, within three to six months, we would get that senior manager liability. On reading the Bill, I am certainly still a bit foggy about it, and it is quite reassuring that the noble Baroness, Lady Harding, is foggy about it too. Is that senior manager liability definitely there? Will it be there?
The Joint Committee made two other recommendations which I thought made a lot of sense: the obligation to report on risk assessment to the main board of a company, and the appointment of a safety controller, which the noble Lord, Lord Knight, mentioned. Such a controller would make it very clear—as with the GDPR, you would have a senior manager on whom you can fix the duty.
Like the noble Baroness, Lady Harding, I would very much like to hear from the Minister on the question of personal liability, as well as about Ofcom. It is important that any criminal prosecution is mediated by Ofcom; that is cardinal. You cannot just create criminal offences where you can have a prosecution without the intervention of Ofcom. That is extraordinarily important.
I have just a couple of final points. The noble Baroness, Lady Fox, comes back quite often to this point about regulation being the enemy of innovation. It very much depends what kind of innovation we are talking about. Technology is not necessarily neutral. It depends how the humans who deploy it operate it. In circumstances such as this, where we are talking about children and about smaller platforms that can do harm, I have no qualms about having regulation or indeed criminal liability. That is a really important factor. We are talking about a really important area.
I very strongly support Amendment 219. It deals with a really important aspect which is completely missing from the Bill. I have a splendid briefing here, which I am not going to read out, but it is all about Mastodon being one example of a new style of federated platform in which the app or hub for a network may be category 1 owing to the size of its user base but individual subdomains or networks sitting below it could fall under category 2 status. I am very happy to give a copy of the briefing to the Minister; it is a really well-written brief, and demonstrates entirely some of the issues we are talking about here.
I reassure the noble Lord, Lord Knight, that I think the amendment is very well drafted. It is really quite cunning in the way that it is done.
My Lords, I wonder whether I can make a brief intervention—I am sorry to do so after the noble Lord, Lord Clement-Jones, but I want to intervene before my noble friend the Minister stands up, unless the Labour Benches are about to speak.
I have been pondering this debate and have had a couple of thoughts. Listening to the noble Lord, Lord Clement-Jones, I am reminded of something which was always very much a guiding light for me when I chaired the Charity Commission, and therefore working in a regulatory space: regulation is never an end in itself; you regulate for a reason.
I was struck by the first debate we had on day one of Committee about the purpose of the Bill. If noble Lords recall, I said in that debate that, for me, the Bill at its heart was about enhancing the accountability of the platforms and the social media businesses. I felt that the contribution from my noble friend Lady Harding was incredibly important. What we are trying to do here is to use enforcement to drive culture change, and to force the organisations not to abandon profit altogether but to move from focusing solely on profit-making to focusing on child safety in the way in which they go about their work. That is really important when we start to consider the whole issue of enforcement.
It struck me at the start of this discussion that we have to be clear what our general approach and mindset is about this part of our economy that we are seeking to regulate. We have to be clear about the crimes we think are being committed or the offences that need to be dealt with. We need to make sure that Ofcom has the powers to tackle those offences and that it can do so in a way that meets Parliament’s and the public’s expectations of us having legislated to make things better.
I am really asking my noble friend the Minister, when he comes to respond on this, to give us a sense of clarity on the whole question of enforcement. At the moment, it is insufficiently clear. Even if we do not get that level of clarity today, when we come back later on and look at enforcement, it is really important that we know what we are trying to tackle here.
The amendments in this group address the Bill’s enforcement powers. I begin by assuring noble Lords that there is a strong package of enforcement powers in the Bill, which will promote compliance with the regulatory regime that it ushers in and ensure that providers are held to account. Ofcom will be given robust powers to use against companies that do not comply with their duties under the Bill; it will be able to impose a penalty and/or direct companies to take specific steps to come into compliance. When companies do not comply with such a direction, Ofcom will be able to issue penalties up to £18 million or 10% of qualifying global revenue, which can be considerably more. Ofcom will also be able to apply to the courts for business disruption measures, which we will touch on in a later group. These are court orders that require third parties to withdraw their services from, or block access to, the non-compliant regulated service.
Amendment 33 in the name of the noble Lord, Lord Stevenson of Balmacara, and moved by the noble Lord, Lord Knight, Amendments 182 and 218B in the name of my noble friend Lord Bethell, and government Amendments 218A, 284D, 284E and 284F all seek to widen senior management liability. It makes sense if I begin with the government amendments.
Senior managers can already be held criminally liable when they fail to ensure that their company provides Ofcom with the information that it needs to regulate. These amendments create a new offence of failure to comply with a requirement imposed by Ofcom in a confirmation decision, in relation to specific child safety duties. In such cases, the senior manager responsible will be liable and can face up to two years in prison, a fine or both.
My noble friend Lady Harding asked me to comment on whether that has to be conscious or deliberate. The means by which the new offence is linked to individuals or senior managers is achieved through the existing liability provisions in Clause 178. It does not have to be conscious or deliberate. This will ensure that a relevant senior manager could be held criminally liable for the offence of failing to comply with the steps in a confirmation decision relating to any linked duty, if such an offence was committed with the consent or connivance of the senior manager or was attributable to the neglect of the senior manager.
This approach is modelled on provisions in the Irish Online Safety and Media Regulation Act 2022. It ensures that services know when an action or omission risks criminal liability, while providing sufficient legal certainty to ensure that the offence can be prosecuted. The duties to which this offence will be linked are the child safety duties under Clause 11(3) and duties for pornographic content under Clause 72. This focuses the new offence on harms that are central to child safety, including self-harm content, eating disorder content and pornography. This offence fulfils the Government’s commitment in another place to bring forward an amendment in your Lordships’ House strengthening the Bill’s protections for children. I am grateful for the comments welcoming them.
Amendments 33 and 182 propose creating new offences for non-compliance with duties under the Bill. Attaching criminal liability directly to the duties would create uncertainty about the criminal action. Creating criminal offences that do not prescribe the required act or omission would give rise to real concerns about the quality of the criminal law. I am pleased to say that the Government’s amendments will achieve the core aims of Amendments 33 and 182 while providing sufficient legal certainty to ensure that managers can be prosecuted. I appreciate that my noble friend Lord Bethell has recognised the benefits of this approach in the drafting of his Amendment 218B.
I note that that amendment and my noble friend’s Amendment 182 link criminal liability with a wider range of duties, but it is important that this offence is a targeted one. As such, we have linked the offence with the specific duties which will most effectively focus efforts on child safety, and have intentionally targeted user-to-user sites, which have much greater control than search services over content and will therefore be best placed to prevent children accessing it. My noble friend asked about not linking senior management liability with child sexual exploitation and abuse content. The Bill already contains very strong powers to tackle child sexual exploitation and abuse content, including the power to require companies to use accredited technology to identify, take down and prevent users encountering such content.
Separately, the Bill imposes a requirement to report child sexual exploitation and abuse content to the National Crime Agency. Persons who falsify information in the course of their child sexual exploitation and abuse content reporting duties can be punished with up to two years in prison. This will tackle such exploitation and abuse at each stage, with strong preventive powers to ensure that such content is prevented from being encountered, that it is identified and removed, and that there are criminal sentences for falsifying information in the required reports to the National Crime Agency. At the same time, we are determined to ensure that this offence is as effective as possible in protecting children, while ensuring that it remains workable. We are willing to engage further with concerned parties to ensure that the provisions achieve these aims. I am very happy to discuss this further with my noble friend and other noble Lords if they wish to do so.
We are taking further steps to strengthen the Bill’s enforcement powers by conferring on Ofcom additional powers of seizure from premises, as per Section 50 of the Criminal Justice and Police Act 2001. Ofcom will be able to apply for a warrant to enter and inspect premises. Powers exercisable by warrant include the seizure of documentation and equipment. This amendment will, in certain circumstances, allow a person exercising this power to remove material from the premises, where it is not reasonably practicable to determine whether it is seizeable, in order to determine later whether they are entitled to seize it. Further, it allows a person to seize material where it is not reasonably practicable to separate it from seizeable material.
The amendments tabled by my fellow Northumbrian, the noble Lord, Lord Curry of Kirkharle, do three things. They require Ofcom to issue provisional notices of contravention if there are reasonable grounds for believing that a service or person is not complying with their duties; they provide that Ofcom can decide not to give an enforcement confirmation decision only if it is satisfied that systems and processes are in place to ensure that the service is in compliance; and they remove Ofcom’s discretion to determine how long specific enforcement steps should take. While I certainly accept the helpful spirit in which the noble Lord has tabled these amendments, I worry that they would undermine the discretion of the regulator to manage the enforcement process as it sees fit in each case. This would, in turn, undermine Ofcom’s ability to regulate in a proportionate way and could make Ofcom’s enforcement processes unnecessarily punitive and inflexible.
Instead, the Bill sees Ofcom acting proportionately in performing its regulatory functions, targeting action where it is needed and adjusting timeframes as necessary. Ofcom will have a statutory obligation to produce guidance on its approach to enforcing the new regime the Bill brings in, just as it does with other sectors that it regulates. Ofcom strives to take a consistent approach across these sectors and often combines guidance on its general principles of enforcement. In addition, as the Bill sets out, Ofcom may draw on guidelines it has produced under Section 392 of the Communications Act which relate to the amount of penalties. These examples of existing enforcement guidance illustrate Ofcom’s experience as a regulator in providing such enforcement guidance. Ofcom is well placed to produce clear and effective guidance to help businesses understand enforcement.
Amendment 219 in the name of the noble Lord, Lord Knight of Weymouth, seeks to impose liability on a provider where a company providing regulated services on its behalf does not comply with the duties in the Bill. The Bill sets out which services will need to comply with duties and makes it clear in Clause 198 that duties fall on the entity with control over the regulated service. Such entities are best placed to keep users safe online, as they can accurately assess risk and put in place systems and processes to minimise harm. At the same time, Ofcom can hold a parent and subsidiary company jointly responsible for the actions of a company if the parent company has sufficient control over the subsidiary. Under Amendment 219, the provider would be liable regardless of whether it has control over the service in question. That would impose an unreasonable burden on businesses and cause confusion regarding which companies are required to comply with the duties in the Bill.
The second group of amendments, in the name of my noble friend Lord Bethell, are Amendments 220A to 220C, which address the timing, nature and content of guidance that Ofcom must produce on its approach to enforcement. This guidance is important to ensure that companies are clear about Ofcom’s processes. The amendments would prescribe the details that Ofcom should include in the guidance. To ensure the guidance is effective, Ofcom must retain the discretion to include the information which it considers relevant, drawing on its long experience as a regulator. As I say, we will come to debate later the business disruption measures for which Ofcom will be given the power to apply to the courts.
Finally, government Amendment 284B is a technical amendment providing extraterritorial application for the enforcement of civil proceedings in relation to a requirement on providers to publish details of enforcement actions. Together, the Bill’s suite of targeted, proportionate enforcement powers, further strengthened by the government amendments to which I have just spoken, will ensure that companies are held accountable. I hope that that brings a bit of clarity to noble Lords. I commend the amendment standing in my name and invite noble Lords not to press theirs.
My Lords, this discussion has been very useful. The noble Baroness, Lady Fox, as ever, made an interesting and thoughtful philosophical rumination. I hope that what she has just heard from the Minister around it applying to quite specific child safety duties gave her some comfort that this was not some kind of sweep-all measure that would result in lots of people being banged up.
The government amendments are tighter than those in the name of the noble Lord, Lord Bethell. In the end, that is the judgment that we all have to make between now and when we finish our consideration of the Bill. I agree with the noble Baroness, Lady Fox, that there are dangers attached to this: that platforms will choose just to exclude children altogether and that that may infringe on some of their rights. That is why we have to get this balance right. It ultimately has to be proportionate.
We have to develop trust in Ofcom to use its powers flexibly and proportionately. I have previously said some of the things that I think are needed in order to build our trust in Ofcom, in respect of transparency and parliamentary scrutiny and so on. I think that the noble Lord, Lord Curry, is right, from his experience, that the noble Lord, Lord Grade, and his colleagues will need to be quick, decisive and tough in using those powers proportionately in order to make these platforms, particularly the large, well-resourced and powerful ones, respond. Listening to the noble Baroness, Lady Harding, I reflected on when I was a senior executive of a largeish corporation a few years ago. I was in post when the anti-bribery and corruption Act, the Data Protection Act and the gender pay gap regulations all came in, and they made the senior executives—of the company I was in, anyway—sit up, take notice and change some behaviours. These things allow corporations to act according to the public interest and to adjust behaviour, without it being disproportionate.
I say to the Minister that the fact that, for example, under the Bribery Act you could be imprisoned on the basis of decisions made in your supply chain was significant. We had to be mindful of our whole supply chain to ensure that there was no corruption going on throughout, which is very different to the judgment the Minister is making on the supply chain in this system. I was grateful to the noble Lord, Lord Clement-Jones, for reminding us of the masterful Mastodon briefing; the way in which that technology is showing different ways in which things can be done to avoid aspects of regulation is another reason to think further about the spirit of Amendment 219 as we move to Report.
When we come to Part 10 and the enforcement section—or perhaps before then privately—it would be really useful, for my sake if for no one else’s, to clarify who the senior manager is. Does the senior manager have to be UK-based for these powers to be used? What happens with all the US companies and those based in parts of eastern Europe that do not have assets or people here, yet the harm extends to users here? How does senior manager liability work in that context? With that, I am happy to withdraw Amendment 33 and look forward to where we go next.
Amendment 33 withdrawn.
Amendment 33A not moved.
33B: After Clause 11, insert the following new Clause—
“Adult risk assessment duties
(1) This section sets out the duties about adult risk assessments which apply in relation to all Category 1 services.
(2) A duty to carry out a suitable and sufficient assessment of the risk of an adult user encountering by means of the service content which is harmful to adults taking into account any relevant risk profile and to keep that assessment up to date, including when OFCOM make any significant change to a risk profile that relates to services of the kind in question, or before making any significant change to any aspect of a service’s design or operation including changes to any user empowerment tools.”
Member’s explanatory statement
This amendment requires Category 1 services to assess the risk of harm to adults arising from the operation of their services.
My Lords, as a former Deputy Leader of this House, if I were sitting on the Front Bench, I would have more gumption than to try to start a debate only 10 minutes before closing time. But I realise that the wheels grind on—perhaps things are no longer as flexible as they were in my day—so noble Lords will get my speech. The noble Lord, Lord Grade, who is at his post—it is very encouraging to see the chair of Ofcom listening to this debate—and I share a love of music hall. He will remember Eric Morecambe saying that one slot was like the last slot at the Glasgow Empire on a Friday night. That is how I feel now.
A number of references have been made to those who served on the Joint Committee and what an important factor it has been in their thinking. I have said on many occasions that one of the most fulfilling times of my parliamentary life was serving on the Joint Committee on what became the Communications Act 2003. The interesting thing was that we had no real idea of what was coming down the track as far as the internet was concerned, but we did set up Ofcom. At that time, a lot of the pundits and observers were saying, “Murdoch’s lawyers will have these government regulators for breakfast”. Well, they did not. Ofcom has turned into a regulator to which—at some stages this has slightly worried me—the Government turn with almost any problem facing them, saying, “We’ll give it to Ofcom”. It has certainly proved that it can regulate across a vast area and with great skill. I have every confidence that the noble Lord, Lord Grade, will take that forward.
Perhaps it is to do with the generation I come from, but I do not have this fear of regulation or government intervention. In some ways, the story of my life is that of government intervention. If I am anybody’s child, I am Attlee’s child—not just because of the reforms of the Labour Party but also because of those of the coalition Government: the Butler Education Act and the bringing in of the welfare state. So I am not afraid of government and Parliament taking responsibility in addressing real dangers.
In bringing forward this amendment, along with my colleague the noble Lord, Lord Lipsey, who cannot be here today, I am referring to legislation that is 20 years old. That is a warning to newcomers; it could be another 20 years before parliamentary time is found for a Bill of this complexity, so we want to be sure that we get its scope right.
The Minister said recently that the Bill is primarily a child safety Bill, but it did not start off that way. Five years ago, the online harms White Paper was seen as a pathfinder and trailblazer for broader legislation. Before we accept the argument that the Bill is now narrowed down to more specific terms, we should think about whether there are other areas that still need to be covered.
These amendments are in the same spirit as those in the names of the noble Baronesses, Lady Stowell, Lady Bull, and Lady Featherstone. We seek to reinstate an adult risk assessment duty because we fear that the change in title signals a reduction in scope and a retreat from the protections which earlier versions of the Bill intended to provide.
It was in this spirit, and to enable us to get ahead of the game, that in 2016 I proposed a Private Member’s Bill on this subject: the Online Harms Reduction Regulator (Report) Bill, which asked Ofcom to publish, in advance of the anticipated legislation, assessments of what action was needed to reduce harm to users and wider society from social networks. I think we can all agree that, if that work had been done in advance of the main legislation, such evidence would be very useful now.
I am well aware that there are those who, in the cause of some absolute concepts of freedom, believe that to seek to broaden the scope of the Bill takes us into the realms of the nanny state. But part of the social contract which enables us to survive in this increasingly complex world is that the ordinary citizen, who is busy struggling with the day-to-day challenges of normal life, does trust his Government and Parliament to keep an anticipatory weather eye on what is coming down the track and on the dangers that lie therein.
When there have been game-changing advances in technology in the past, it has often taken a long time for societies to adapt and adjust. The noble Lord, Lord Moylan, referred to the invention of the printing press. That caused the Reformation, the Industrial Revolution and around 300 years of war, so we have to be careful how we handle these technological changes. Instagram was founded in 2010, and the iPhone 4 was released then too. One eminent social psychologist wrote:
“The arrival of smartphones rewired social life.”
It is not surprising that liberal democracies, with their essentially 18th-century construct of democracy, struggle to keep up.
The record of big tech in the last 20 years has, yes, delivered an amazing leap in access to information. However, that quantum leap has come with a social cost in almost every aspect of our lives. Nevertheless, I refuse to accept the premise that these technologies are too global and too powerful in their operation for them not to come within the reach of any single jurisdiction or the rule of law. I am more impressed by efforts by big tech companies to identify and deal with real harms than I am by threats to quit this or that jurisdiction if they do not get the light-touch regulation they want so as to be able to maximise profits.
We know by their actions that some companies and individuals simply do not care about their social responsibilities or the impact of what they sell and how they sell it on individuals and society as a whole. That is why the social contract in our liberal democracies means a central role for Parliament and government in bringing order and accountability into what would otherwise become a jungle. That is why, over the last 200 years, Parliament has protected its citizens from the bad behaviour of employers, banks, loan sharks, dodgy salesmen, insanitary food, danger at work and so on. In this new age, we know that companies large and small, British and foreign, can, through negligence, indifference or malice, drive innocent people into harmful situations. The risks that people face are complex and interlocking; they cannot be reduced to a simple list, as the Government seek to do in Clause 12.
When I sat on the pre-legislative committee in 2003, we could be forgiven for not fully anticipating the tsunami of change that the internet, the world wide web and the iPhone were about to bring to our societies. That legislation did, as I said, establish Ofcom with a responsibility to promote media literacy, which it has only belatedly begun to take seriously. We now have no excuse for inaction or for drawing up legislation so narrowly that it fails to deal with the wide risks that might befall adults in the synthetic world of social media.
We have tabled our amendments not because they will solve every problem or avert every danger but because they would be a step in the right direction and so make this a better Bill.
I am very grateful to the noble Lord, Lord McNally, for namechecking me and the amendments I have tabled with the support of the noble Baronesses, Lady Featherstone and Lady Bull, although I regret to inform him that they are not in this group. I understand where the confusion has come from. They were originally in this group, but as it developed I felt that my amendments were no longer in the right place. They are now in the freedom of expression group, which we will get to next week. What he has just said has helped, because the amendments I am bringing forward are not similar to the ones he has tabled. They have a very different purpose. I will not pre-empt the debate we will have when we get to freedom of expression, but I think it is only proper that I make that clear. I am very grateful to the noble Lord for the trail.
Debate on Amendment 33B adjourned.
House adjourned at 9.59 pm.