Committee (10th Day) (Continued)
Clause 146: OFCOM’s report about researchers’ access to information
Amendment 230
Moved by
230: Clause 146, page 128, line 35, leave out from “publish” to end of line 36 and insert “an interim report within the period of three months beginning with the day on which this section comes into force, and a final report within the period of two years beginning on the day on which the interim report is published.”
Member’s explanatory statement
This amendment seeks to accelerate the process relating to Ofcom’s report on researchers’ access to information. Instead of simply requiring a report within two years of Clause 146 being brought into force, this amendment would require an interim report within three months, with a final report to follow two years after that.
My Lords, my noble friend Lord Stevenson, who tabled this amendment, unfortunately cannot be with us today as he is off somewhere drinking sherry, I hope.
This is an important set of amendments about researchers’ access to data. As I have previously said to the Committee, we need to ensure that Ofcom has the opportunity to be as trusted as possible in doing its job, so that we can give it as much flexibility as we can, and so that it can deal with a rapidly changing environment. As I have also said on more than one occasion, in my mind, that trust is built by the independence of Ofcom from Secretary of State powers; the ongoing and post-legislative scrutiny of Parliament, which is not something that we can deal with in this Bill; and, finally, transparency—and this group of amendments goes to that very important issue.
The lead amendment in this group, Amendment 230 in my noble friend Lord Stevenson’s name, seeks to accelerate the process relating to Ofcom’s report on researchers’ access to information. Instead of simply requiring a report within two years of Clause 146 being brought into force, this amendment would require an interim report within three months, with a final report to follow two years later. Although it is the lead amendment in the group, I do not think it is the most significant because, in the end, it does not do much about the fundamental problem that we want to deal with in this group, which is the need to do better than just having a report. We need to ensure that there really is access by independent researchers.
Amendments 233 and 234 are, I think, of more significance. These proposed new clauses would assist independent researchers in accessing information and data from providers of regulated services. Amendment 233 would allow Ofcom itself to appoint researchers to undertake a variety of research. Amendment 234 would require Ofcom to issue a code of practice on researchers’ access to data; again, this is important so that the practical and legal difficulties for both researchers and service providers can be overcome through negotiation and consultation by Ofcom. Amendment 233A from the noble Lord, Lord Allan, which I am sure he will speak to in a moment, is helpful in clarifying that no data protection breach would be incurred by allowing the research access.
In many ways, there is not a huge amount more to say. When Melanie Dawes, the head of Ofcom, appeared before the Joint Committee on 1 November 2021—all that time ago—she said that
“tightening up the requirement to work with external researchers would be a good thing in the Bill”.
It is therefore a disappointment that, when the Bill was finally published after the Joint Committee’s consideration of the draft, there was not something more significant and more weighty than just a report. That is what we are trying to address, particularly now that we see, as an example, that Twitter is charging more than £30,000 a month for researchers’ access. That is quite a substantial rate in order for researchers to be able to do their work in respect of that platform. Others are restricting or obscuring some of the information that people want to be able to see.
This is a vital set of measures if this Bill is to be effective. These amendments go a long way towards where we want to get to on this; for the reasons I have set out around ensuring that there is transparency, they are vital. We know from the work of Frances Haugen that the platforms themselves are doing this research. We need that out in the open, we need Ofcom to be able to see it through independent researchers and we need others to be able to see it so that Parliament and others can continue to hold these platforms to account. Given that the Minister is in such a positive mood, I look forward to his positive response.
My Lords, I must advise the Committee that if Amendment 230 is agreed to then I cannot call Amendment 231 because of pre-emption.
My Lords, we are reaching the end of our Committee debates, but I am pleased that we have some time to explore these important questions raised by the noble Lord, Lord Knight of Weymouth.
I have an academic friend who studies the internet. When asked to produce definitive answers about how the internet is impacting on politics, he politely suggests that it may be a little too soon to say, as the community is still trying to understand the full impact of television on politics. We are rightly impatient for more immediate answers to questions around how the services regulated by this Bill affect people. For that to happen, we need research to be carried out.
A significant amount of research is already being done within the companies themselves—both more formal research, often done in partnership with academics, and more quick-fix commercial analyses where the companies do their own studies of the data. These studies sometimes see the light of day through publication or quite often through leaks; as the noble Lord, Lord Knight, has referred to, it is not uncommon for employees to decide to put research into the public domain. However, I suggest that this is a very uneven and suboptimal way for us to get to grips with the impact of these services. The public interest lies in there being a much more rigorous and independent body of research work, which, rightly, these amendments collectively seek to promote.
The key issues that we need to address head-on, if we are actively to promote more research, lie within the data protection area. That has motivated my Amendment 233A—I will explain the logic of it shortly—and is the reason why I strongly support Amendment 234.
A certain amount of research can be done without any access to personal data, bringing together aggregated statistics of what is happening on platforms, but the reality is that many of the most interesting research questions inevitably bring us into areas where data protection must be considered. For example, looking at how certain forms of content might radicalise people will involve looking at what individual users are producing and consuming and the relationships between them. There is no way around that for most of the interesting questions about the harms we are looking at. If you want to know whether exposure to content A or content B led to a harm, there is no way to do that research without looking at the individual and the specifics.
There is a broad literature on how anonymisation and pseudonymisation techniques can be used to try to make those datasets a little safer. However, even if the data can be made safe from a technical point of view, that still leaves us with significant ethical questions about carrying out research on people who would not necessarily consent to it and may well disagree with the motivation behind the sorts of questions we may ask. We may want to see how misinformation affects people and steers them in a bad direction; that is our judgment, but the judgment of the people who use those services and consume that information may well be that they are entirely happy and there is no way on earth that they would consent to be studied by us for something that they perceive to be against their interests.
Those are real ethical questions that have to be asked by any researcher looking at this area. That is what we are trying to get to in the amendments—whether we can create an environment with that balance of equity between the individual, who would normally be required to give consent to any use of their data, and the public interest. We may determine that, for example, understanding vaccine misinformation is sufficiently important that we will override that individual’s normal right to choose whether to participate in the research programme.
My Amendment 233A is to Amendment 233, which rightly says that Ofcom may be in a position to say that, for example, understanding vaccine misinformation is in the overriding public interest and we need research into it. If it decides to do that and the platforms transfer data to those independent researchers, because we have said in the amendment that they must, the last thing we want is for the platforms to feel that, if there is any problem further down the track, there will be comeback on them. That would be against the principle of natural justice, given that they have been instructed to hand the data over, and could also act as a barrier.
I raise these concerns because they are not far-fetched: however well-intentioned somebody is and however well they think they are doing data security, the reality of today’s world is that there are data breaches. Once you have handed the data over, at some point some independent researcher is going to have a dataset compromised, and Ofcom itself may be in possession of data that is going to be compromised. Amendment 233A seeks to clarify that, in those circumstances, we are not going to go after the company.
People may be aware of a case involving my former employer and a company called Cambridge Analytica. If you look at the fallout from that case, some of the decisions that were made pointed to the notion that the first party which originally collected the data can almost never say that they are no longer liable; any transfer to a third party carries their liability with it. That is reasonable in most cases: if you are passing data on to somebody else for commercial reasons, it is fair that the liability stays with you. However, in the circumstances where we have said the regulator is going to insist that they provide the data for a valid public purpose, I do not think we should be holding them liable if something goes wrong downstream—that is the rationale for Amendment 233A.
That brings me on to Amendment 234, which is a good way of trying to address the problem more generally. Sometimes there is an assumption that research is good and companies are bad: “Hand over the data and good stuff will happen”. There is a variable community of companies and a variable community of researchers, in terms of the confidence we can have in them to maintain data security and privacy. Having some kind of formal mechanism to approve researchers, and for researchers to sign up to, is extraordinarily helpful.
I refer noble Lords to the work done by the European Digital Media Observatory—this is one of those declarations of interests that is really a confession of expertise. I was on the board of the European Digital Media Observatory, for which I had no remuneration as it was done as a bit of penance having worked in the sector. As part of my penance, I felt I should be helping bodies that try to deal with the misinformation issue. The European Digital Media Observatory is a European Commission-sponsored body trying to deal with these exact questions, asking how we enable more research to happen, looking at misinformation in the context of the EU. It did some excellent work led by Dr Rebekah Tromble, an academic at George Washington University, who convened a working group which has come up with a code of practice that is going through the EU process. As long as we are not divergent from the general data protection regulation, it would have significant applicability here.
The real benefit of such an approach is that everyone knows what they are supposed to do, and we can objectively test whether or not they are doing it: the party that collected the data and handed it over; and the party that receives the data and does the research—everyone has very clear roles and responsibilities. By doing that, we unlock the flows, which is what we want to do collectively in these amendments: we want the data to flow from the regulated services to the independent researchers.
I am not arguing that this will necessarily solve all the problems, but it will certainly flush out whether, when services say they cannot provide data for research, that is a “cannot” or “will not”. Today, they can say they cannot for data protection legal reasons—I think with some justification. If we have the code of conduct in place as proposed in Amendment 234, and the researchers are approved researchers who have signed up to it and committed to doing all the right things, then it is much more reasonable for us to say “Platform, meet researcher; researcher, meet platform—you all know your responsibilities, and there are no legal barriers”, and to expect the data to move in a way that will meet those public interest obligations.
This is an important set of amendments which we are coming to quite late in the day. They touch on some issues that are being dealt with elsewhere, and I hope this is one example where we will feel comfortable learning from the EU, which is a little ahead in trying to deal with some of these questions, working within what is still, from a data protection law point of view at least, a pretty consistent framework between us and them.
My Lords, Amendments 233 and 234 from the noble Lord, Lord Knight of Weymouth, were well motivated, so I will be brief. I just have a couple of queries.
First, we need to consider what the criteria are for who is considered worthy of the privileged status of receiving Ofcom approval as a researcher. We are discussing researchers as though they are totally reliable and trustworthy. We might even think that if they are academic researchers, they are bound to be. However, there was an interesting example earlier this week of confirmation bias leading to mistakes when King’s College had to issue a correction to its survey data that was used in the BBC’s “Marianna in Conspiracyland”. King’s College admitted that it had wildly overestimated the numbers of those reading the conspiracy newspaper The Light, and wildly overestimated the numbers of those attending what it dubbed conspiracy demonstrations. By the way, BBC Verify has so far failed to verify the mistake it repeated. I give this example not as a glib point but because we cannot just say that because researchers are accredited elsewhere they should just be allowed in. I also think that the requirement to give the researchers
“all such assistance as they may reasonably require to carry out their research”
sounds like a potentially very time-consuming and expensive effort.
The noble Lord, Lord Allan of Hallam, raised points around “can’t” or “won’t”, and whether this means researchers “must” or “should”, and who decides whether it is ethical that they “should” in all instances. There are ethical questions here that have been raised. Questions of privacy are not trivial. Studying individuals as specimens of “badthink” or “wrongthink” might appear in this Committee to be in the public interest, but without people’s consent it can be quite damaging. We have to decide which questions serve the public interest sufficiently that consent could be overridden in that way.
I do not think this is a slam-dunk, though it looks like a sensible point. I do not doubt that all of us want more research, and good research, and data we can use in arguments, whatever side we are on, but it does not mean we should just nod something through without at least pausing.
My Lords, I declare an interest as a trustee of the International Centre for the Study of Radicalisation at the War Studies department of King’s College London. That is somewhere that conducts research using data of the kind addressed in this group, so I have a particular interest in it.
We know from the kind of debates that the noble Lord, Lord Knight, referred to that it is widely accepted that independent researchers benefit hugely from access to relevant information from service providers to research online safety matters. That is why my Amendment 234, supported by the noble Lords, Lord Clement-Jones and Lord Knight, aims to introduce an unavoidable mandatory duty for regulated platforms to give access to that data to approved researchers.
As the noble Lord, Lord Knight, said, there are three ways in which this would be done. First, the timeframe for Ofcom’s report would be accelerated; secondly, proposed new Clause 147 would allow Ofcom to appoint the researchers; and, thirdly, proposed new Clause 148 would require Ofcom to write a code of practice on data access, setting up the fundamental principles for data access—a code which, by the way, should answer some of the concerns quite reasonably voiced by the noble Baroness, Lady Fox.
The internet is absolutely the most influential environment in our society today, but it is a complete black box, and we have practically no idea what is going on in some of the most important parts of it. That has a terrible impact on our ability to devise sensible policies and mitigate harm. Instead, we have a situation where the internet companies decide who accesses data, how much of it and for what purposes.
In answer to his point, I can tell the noble Lord, Lord Allan, who they give the data to—they give it to advertisers. I do not know if anyone has bought advertising on the internet, but it is quite a chilling experience. You can find out a hell of a lot about quite small groups of people if you are prepared to pay for the privilege of trying to reach them with one of your adverts: you can find out what they are doing in their bedrooms, what their mode of transport is to get to work, how old they are, how many children they have and so on. There is almost no limit to what you can find out about people if you are an advertiser and you are prepared to pay.
In fact, only the companies themselves can see the full picture of what goes on on the internet. That puts society and government at a massive disadvantage and makes policy-making virtually impossible. Noble Lords should be in no doubt that these companies deliberately withhold valuable information to protect their commercial interests. They obfuscate and confuse policymakers, and they protect their reputations from criticism about the harms they cause by withholding data. One notable outcome of that strategy is that it has taken years for us to be here today debating the Online Safety Bill, precisely because policy-making around the internet has been so difficult and challenging.
A few years ago, we were making some progress on this issue. I used to work with the Institute for Strategic Dialogue using CrowdTangle, a Facebook product. It made a big impact. We were working on a project on extremism, and having access to CrowdTangle revolutionised our understanding of how the networks of extremists that were emerging in British politics were coming together. However, since then, platforms have gone backwards a long way and narrowed their data-sharing. The noble Lord, Lord Knight, mentioned that CrowdTangle has essentially been closed down, and Twitter has basically stopped providing its free API for researchers—it charges for some access but even that is quite heavily restricted. These retrograde steps have severely hampered our ability to gather the most basic data from otherwise respectable and generally law-abiding companies. That leaves us totally blind to what is happening on the rest of the internet—the bit beyond the nice bit; the Wild West bit.
Civil society plays a critical role in identifying harmful content and bad behaviour. Organisations such as the NSPCC, the CCDH, the ISD—which I mentioned—the Antisemitism Policy Trust and King’s College London, with which I have a connection, prove that their work can make a really big difference.
It is not as though other parts of our economy or society have the same approach. In fact, in most parts of our world there is a mixture of public, regulator and expert access to what is going on. Retailers, for instance, publish what is sold in our shops. Mobile phones, hospitals, banks, financial markets, the broadcast media—they all give access, both to the public and to their regulators, to a huge amount of data about what is going on. Once again, internet companies are claiming exceptional treatment—that has been a theme of debates on the Online Safety Bill—as if what happens online should, for some reason, be different from what happens in the rest of the world. That attitude is damaging the interests of our country, and it needs to be reversed. Does anyone think that the FSA, the Bank of England or the MHRA would accept this state of affairs in their regulated market? They absolutely would not.
Greater access to and availability of data and information about systems and processes would hugely improve our understanding of the online environment and thereby protect the innovation, progress and prosperity of the sector. We should not have to wait for Ofcom to be able to identify new issues and then appoint experts to look at them closely; there should be a broader effort to be in touch with what is going on with the internet. It is the nature of regulation that Ofcom will heavily rely on researchers and civil society to help enforce the Online Safety Bill, but this can be achieved only if researchers have sufficient access to data.
As the noble Lord, Lord Allan, pointed out, legislators elsewhere are making progress. The EU’s Digital Services Act gives a broad range of researchers access to data, including civil society and non-profit organisations dedicated to public interest research. The DSA sets out a framework for vetting and access procedures in detail, as the noble Baroness, Lady Fox, rightly pointed out, creating an explicit role for new independent supervisory authorities and digital services co-ordinators to manage that process.
Under Clause 146, Ofcom must produce a report exploring such access within two years of that section of the Bill coming into effect. That is too long. There is no obligation on the part of the regulator or service providers to take this further. No arguments have been put forward for this extended timeframe or relative uncertainty. In contrast, the arguments to speed up the process are extremely persuasive, and I invite my noble friend the Minister to address those.
To anticipate the Minister, I will just say that the skilled persons reports created in Clause 94 give Ofcom the helpful power to appoint a skilled person to provide a report, assisting Ofcom to identify and assess a failure by a regulated service. This will be an essential tool for Ofcom to access external expertise. However, it does not create a broader ecosystem of inspection, study and accountability that includes academics and civil society institutions with the capability and expertise to reach into the data and identify the harms of the platforms and their broader effects on society. That is why these measures need to be in the Bill. Given the Minister’s good mood today, I invite him to adopt wholesale these amendments in time for Report.
My Lords, I too want to support this group of amendments, particularly Amendment 234, and will make just a couple of brief points.
First, one of the important qualities of the online safety regime is transparency, and this really speaks to that point. It is beyond clear that we are going to need all hands on deck, and again, this speaks to that need. I passionately agree with the noble Baroness, Lady Fox, on this issue and ask, when does an independent researcher stop being independent? I have met quite a lot on my journey who suddenly find ways of contributing to the digital world other than their independent research. However, the route described here offers all the opportunities to put those balancing pieces in place.
Secondly, I am very much aware of the fear of the academics in our universities. I know that a number of them wrote to the Secretary of State last week saying that they were concerned that they would be left behind their colleagues in Europe. We do not want to put up barriers for academics in the UK. We want the UK to be at the forefront of governance of the digital world; this amendment speaks to that, and I see no reason for the Government to reject it.
Finally, I want to emphasise the importance of research. Revealing Reality did research for 5Rights called Pathways, in which it built avatars for real children and revealed the recommendation loops in action. We could see how children were being offered self-harm, suicide, extreme diets and livestream porn within moments of them arriving online. Frances Haugen has already been mentioned. She categorically proved what we have been asserting for years, namely that Instagram impacts negatively on teenage girls. As we put this regime in place, it is not adequate to rely on organisations that are willing to work in the grey areas of legality to get their research or on whistleblowers—on individual acts of courage—to make the world aware.
One of the conversations I remember happened nearly five years ago, when the then Secretary of State asked me what the most important thing about the Bill was. I said, “To bring a radical idea of transparency to the sector”. This amendment goes some way to doing just that.
My Lords, I, too, support Amendments 233 and 234, as well as Amendment 233A from the noble Lord, Lord Allan. As the noble Baroness, Lady Kidron, said, it has been made clear in the past 10 days of Committee that there is a role for every part of society to play to make sure that we see the benefits of the digital world but also mitigate the potential harms. The role that researchers and academics can play in helping us understand how the digital world operates is critical—and that is going to get ever more so as we enter a world of large language models and AI. Access to data in order to understand how digital systems and processes work will become even more important—next week, not just in 10 years’ time.
My noble friend Lord Bethell quite rightly pointed out the parallels with other regulators, such as the MHRA and the Bank of England. A number of people are now comparing the way in which the MHRA and other medical regulators regulate the development of drugs with how we ought to think about the emergence of regulation for AI. This is a very good read-across: we need to set the rules of the road for researchers and ensure, as the noble Baroness, Lady Kidron, said—nailing it, as usual—that we have the most transparent system possible, enabling people to conduct their research in the light, not in the grey zone.
My Lords, as the noble Baroness, Lady Kidron, said, clearly, transparency is absolutely one of the crucial elements of the Bill. Indeed, it was another important aspect of the Joint Committee’s report. Like the noble Lord, Lord Knight—a fellow traveller on the committee—and many other noble Lords, I much prefer the reach of Amendments 233 and 234, tabled by the noble Lord, Lord Bethell, to Amendment 230, the lead amendment in this group.
We strongly support amendments that aim to introduce a duty for regulated platforms to enable access by approved independent researchers to information and data from regulated services, under certain conditions. Of course, there are arguments for speeding up the process under Clause 146, but this is really important because companies themselves currently decide who accesses data, how much of it and for what purposes. Only the companies can see the full picture, and the effect of this is that it has taken years to build a really solid case for this Online Safety Bill. Without a greater level of insight, enabling quality research and harm analysis, policy-making and regulatory innovation will not move forward.
I was very much taken by what the noble Baroness, Lady Harding, had to say about the future in terms of the speeding up of technological developments in AI, which inevitably will make the opening up of data, and research into it, of greater and greater importance. Of course, I also take extremely seriously my noble friend’s points about the need for data protection. We are very cognisant of the lessons of Cambridge Analytica, as he mentioned.
It is always worth reading the columns of the noble Lord, Lord Hague. He highlighted this issue last December, in the Times. He said:
“Social media companies should be required to make anonymised data available to third-party researchers to study the effect of their policies. Crucially, the algorithms that determine what you see—the news you are fed, the videos you are shown, the people you meet on a website—should not only be revealed to regulators but the choices made in crafting them should then be open to public scrutiny and debate”.
Those were very wise words. The status quo leaves transparency in the hands of big tech companies with a vested interest in opacity. The noble Lord, Lord Knight, mentioned Twitter announcing in February that it would cease allowing free research access to its application programming interface. It is on a whim that a billionaire owner can decide to deny access to researchers.
I much prefer Amendment 233, which would enable Ofcom to appoint an approved independent researcher. The Ofcom code of practice proposed in Amendment 234 would be issued for researchers and platforms, setting out the procedures for enabling access to data. I take the point made by the noble Baroness, Lady Fox, about who should be an independent accredited researcher, but I hope that that is exactly the kind of thing that a code of practice would deal with.
Just as a little contrast, Article 40 of the EU’s Digital Services Act gives access to data to a broad range of researchers—this has been mentioned previously—including civil society and non-profit organisations dedicated to public interest research. The DSA sets out in detail the framework for vetting and access procedures, creating an explicit role for new independent supervisory authorities. This is an example that we could easily follow.
The noble Lord, Lord Bethell, mentioned the whole question of skilled persons. Like him, I do not believe that this measure is adequate as a substitute for what is contained in Amendments 233 and 234. It will be a useful tool for Ofcom to access external expertise on a case-by-case basis but it will not provide for what might be described as a wider ecosystem of inspection and analysis.
The noble Lord also mentioned the fact that internet companies should not regard themselves as an exception. Independent scrutiny is a cornerstone of the pharmaceutical, car, oil, gas and finance industries. They are open to scrutiny from research; we should expect that for social media as well. Independent researchers are already given access in many other circumstances.
The case for these amendments has been made extremely well. I very much hope to see the Government, with the much more open approach that they are demonstrating today, accept the value of these amendments.
My Lords, the Government are supportive of improving data sharing and encouraging greater collaboration between companies and researchers, subject to the appropriate safeguards. However, the data that companies hold about users can, of course, be sensitive; as such, mandating access to data that are not publicly available would be a complex matter, as noble Lords noted in their contributions. The issue must be fully thought through to ensure that the risks have been considered appropriately. I am grateful for the consideration that the Committee has given this matter.
It is because of this complexity that we have given Ofcom the task of undertaking a report on researchers’ access to information. Ofcom will conduct an in-depth assessment of how researchers can currently access data. To the point raised by the noble Lord, Lord Knight, and my noble friend Lord Bethell, let me provide reassurance that Ofcom will assess the impact of platforms’ policies that restrict access to data in this report, including where companies charge for such access. The report will also cover the challenges that constrain access to data and how such challenges might be addressed. These insights will provide an evidence base for any guidance that Ofcom may issue to help improve data access for researchers in a safe and secure way.
Amendments 230 and 231 seek to require Ofcom to publish a report into researchers’ access to data more rapidly than within the currently proposed two years. I share noble Lords’ desire to develop the evidence base on this issue swiftly, but care must be taken to balance Ofcom’s timelines to ensure that it can deliver its key priorities in establishing the core parts of the regulatory framework that the Bill will bring in; for example, the illegal content and child safety duties. Implementing these duties must be the immediate priority for Ofcom to ensure that the Bill meets its objective of protecting people from harm. It is crucial that we do not divert attention away from these areas and that we allow Ofcom to carry out this work as soon as is practicable.
Further to this, considering the complex matter of researchers’ access to data will involve consultation with interested parties, such as the Information Commissioner’s Office, the Centre for Data Ethics and Innovation, UK Research and Innovation, representatives of regulated services and others—including some of those parties mentioned by noble Lords today—as set out in Clause 146(3). This is an extremely important issue that we need to get right. Ofcom must be given adequate time to consult as it sees necessary and undertake the appropriate research.
Amendments 233 to 234 would introduce a framework for approving independent researchers’ access to data and require Ofcom to issue a code of practice about it. Again, I am sympathetic to the intention, but it is vital that we understand and can fully mitigate the risks of mandating researchers’ access to data which are not publicly available before implementing such a framework. For example, there will be questions of privacy, the protection of personal data, user consent and the disclosure of commercially sensitive information. As the noble Baroness, Lady Fox, pointed out, there are challenges involved in defining who is an independent researcher, particularly as we seek to ensure that such a framework could not be exploited by bad actors.
Exempting service providers from data protection legislation for the purposes of facilitating researcher access to information poses a number of risks. Making derogations from data protection legislation, outside that legislation itself, risks undermining the coherence of the data protection framework. Ofcom’s report on researchers’ access to information will develop the evidence base on current access to information, the challenges associated with accessing it and how these challenges might be overcome.
As we have discussed previously in Committee, researchers will already have access to the publicly available transparency reports which major platforms will be required to publish annually. They may use these data to conduct research into online harms. Additionally, Ofcom will be required to undertake research into UK users’ opinions and experiences relating to regulated services and will have the power to undertake online safety research more broadly. This will help Ofcom and the wider public to understand the experiences of users online and will help to inform policy-making. Ofcom will have wide-ranging powers to require information from companies to support its research activities. Companies will not be able to withhold relevant online safety data when they are required of them by Ofcom. If they do, they will face enforcement measures, including significant fines and the liability of senior managers.
As noted by noble Lords, Ofcom will also have the power to require a report from a skilled person about a regulated service. Ofcom may appoint an independent researcher as the skilled person. This power may be used to assist Ofcom in identifying and assessing non-compliance and, where Ofcom considers there to be a risk of failure, to develop its understanding of the risk. Amendment 234 requires Ofcom to issue a code of practice on researchers’ access to data. I reassure the Committee that following the publication of its report, Ofcom will have the power to issue guidance to providers and researchers about how to facilitate data sharing for research in a safe and responsible way.
In summary, the regulatory framework’s focus on transparency will improve the data which are publicly available to researchers, while Ofcom’s report on the issue will enable the development of the evidence base before further action is considered. At the risk of disappointing noble Lords about the more open-minded attitudes today—
Before the Minister succeeds in disappointing us, can he clarify something for us? Once Ofcom has published the report, it has the power to issue guidance. What requirement is there for platforms to abide by that guidance? We want there to be some teeth at the end of all this. There is a concern that a report will be issued, followed by some guidance, but that nothing much else will happen.
It is guidance rather than direction, but it will be done openly and transparently. Users will be able to see the guidance which Ofcom has issued, to see whether companies have responded to it as they see fit and, through the rest of the framework of the Bill, be empowered to make their decisions about their experiences online. This being done openly and transparently, and informed by Ofcom’s research, will mean that everyone is better informed.
We are sympathetic to the amendment. It is complex, and this has been a useful debate—
I wonder whether the Minister has an answer to the academic community, who now see their European colleagues getting ahead through being able to access data through other legislation in other parts of the world. Also, we have a lot of faith in Ofcom, but it seems a mistake to let it be the only arbiter of what needs to be seen.
We are very aware that we are not the only jurisdiction looking at the important issues the Bill addresses. The Government and, I am sure, academic researchers will observe the implementation of the European Union’s Digital Services Act with interest, including the provisions about researchers’ access. We will carefully consider any implications of our own online safety regime. As noble Lords know, the Secretary of State will be required to undertake a review of the framework between two and five years after the Bill comes into force. We expect that to include an assessment of how the Bill’s existing transparency provisions facilitate researcher access.
I do not expect the Minister to have an answer to this today, but it will be useful to get this on the record as it is quite important. Can he let us know the Government’s thinking on the other piece of the equation? We are getting the platforms to disclose the data, and an important regulatory element is the research organisations that receive it. In the EU, that is being addressed with a code of conduct, which is a mechanism enabled by the general data protection regulation that has been approved by the European Data Protection Board and creates this legal framework. I am not aware of equivalent work having been done in the UK, but that is an essential element. We do not want to find that we have the teeth to persuade the companies to disclose the data, but not the other piece we need—probably overseen by the Information Commissioner’s Office rather than Ofcom—which is a mechanism for approving researchers to receive and then use the data.
We are watching with interest what is happening in other jurisdictions. If I can furnish the Committee with any information in the area the noble Lord mentions, I will certainly follow up in writing.
I have a question, in that case, in respect of the jurisdictions. Why should we have weaker powers for our regulator than others?
I do not think that we do. We are doing things differently. Of course, Ofcom will be looking at all these matters in its report, and I am sure that Parliament will have an ongoing interest in them. As jurisdictions around the world continue to grapple with these issues, I am sure that your Lordships’ House and Parliament more broadly will want to take note of those developments.
But surely, there is no backstop power. There is the review but there is no backstop which would come into effect on an Ofcom recommendation, is there?
We will know once Ofcom has completed its research and examination of these complex issues; we would not want to pre-judge its conclusions.
Again, that would require primary legislation.
With that, if there are no further questions, I invite the noble Lord to withdraw his amendment.
My Lords, this was a short but important debate with some interesting exchanges at the end. The noble Baroness, Lady Harding, mentioned the rapidly changing environment generated by generative AI. That points to the need for wider ecosystem-level research on an independent basis than we fear we might get as things stand, and certainly wider than the skilled persons we are already legislating for. The noble Lord, Lord Bethell, referred to the access that advertisers already have to insight. It seems a shame that we run the risk, as the noble Baroness, Lady Kidron, pointed out, of researchers in other jurisdictions having more privileged access than researchers in this country, and therefore becoming dependent on those researchers and whistleblowers to give us that wider view. We could proceed with a report and guidance as set out in the Bill but add in some reserved powers in order to take action if the report suggests that Ofcom might need and want that. The Minister may want to reflect on that, having listened to the debate. On that basis, I am happy to beg leave to withdraw the amendment.
Amendment 230 withdrawn.
Amendment 231 not moved.
Clause 146 agreed.
Amendments 232 to 234 not moved.
Clause 147 agreed.
Amendments 235 to 241 not moved.
Amendment 242
Moved by
242: Before Clause 148, insert the following new Clause—
“General procedure
(1) An appeal to the Upper Tribunal under section 148 or 149 must be commenced by sending a notice of appeal to the court.
(2) The notice of appeal must set out the grounds of appeal in sufficient detail to indicate—
(a) under which provision of this Act the appeal is to be brought;
(b) to what extent (if any) the appellant contends that the decision against, or with respect to which, the appeal is brought was based on an error of fact or was wrong in law; and
(c) to what extent (if any) the appellant is appealing against OFCOM’s exercise of its discretion in making the disputed decision.
(3) The Upper Tribunal may give an appellant leave to amend the grounds of appeal identified in the notice of appeal.”
Member’s explanatory statement
This amendment introduces additional procedural steps to be followed when the Upper Tribunal considers an appeal under Clauses 148 and 149.
My Lords, I am pleased to speak to Amendments 242, 243 and 245, which have been tabled in the name of my noble friend Lord Stevenson. The intention of this group is to probe what we consider to be an interesting if somewhat niche area, and I hope the Minister will take it in that spirit.
To give the Committee some idea of the background to this group, when Ofcom was originally set up and was mainly dealing with mobile and fixed telephony cartels, it had a somewhat torrid time, if I can describe it that way. Just about every decision it took was challenged in the courts on the so-called merits of the respective cases and on its powers, as the companies taking it to court had many resources they could call upon. That very much held up Ofcom’s progress and, of course, incurred major costs.
Prior to the Digital Economy Act, the worst of the experiences of this period were over, but Ofcom managed to persuade the Government that challenges made by companies in scope of Ofcom would in future be based on judicial review, rather than on merits. In other words, the test was whether Ofcom had acted within its powers and had not acted irrationally. An area of concern to a number of companies is who can challenge the regulator, even if it is acting within its powers, if it gets it wrong in the eyes of said companies. Perhaps the Minister will reflect on that.
This group of amendments is intended to provide better protections for service providers, their users and the wider public, alongside processes that should mean fewer delays and greater efficiency. The Competition Act 1998 permits appeals of Ofcom’s decisions to be made additionally on account of an error of fact, an error of law or an error of the exercise of its discretion.
The current wording of the Bill permits challenge to Ofcom’s decisions only by way of judicial review, which habitually leads to prolonged and drawn-out litigation in a process that could take some nine to 18 months. It also means that no party is able to challenge the evidence or the existence of a factual error. In light of their sensitive nature and the significant impact that detection orders may have on a large number of users of any particular service, these amendments would be a proportionate step to permit challenge on an extended basis. I look forward to hearing the response from the Minister, and I beg to move.
My Lords, I congratulate the noble Baroness on having elucidated this arcane set of amendments. Unfortunately, though, it makes me deeply suspicious when I see what the amendments seem to do. I am not entirely clear about whether we are returning to some kind of merits-based appeal. If so, since the main litigators are going to be the social media companies, it will operate for their benefit to reopen every single thing that they possibly can on the basis of the original evidence that was taken into account by Ofcom, as opposed to doing it on a JR basis. It makes me feel quite uncomfortable if it is for their benefit, because I suspect it is not going to be for the ordinary user who has been disadvantaged by a social media company. I hope our brand spanking new independent complaints system—which the Minister will no doubt assure us is well on the way—will deal with that, but this strikes me as going a little too far.
My Lords, I enter the fray with some trepidation. In a briefing, Carnegie, which we all love and respect, and which has been fantastic in the background in Committee days, shared some concerns. As I interpret its concerns, when Ofcom was created in 2003 its decisions could be appealed on their merits, as the noble Lord has just suggested, to the Competition Appeal Tribunal, and I believe that this was seen as a balancing measure against an untested regime. What followed was that the broad basis on which appeal was allowed led to Ofcom defending 10 appeals per year, which really frustrated its ability as a regulator to take timely decisions. It turned out that the appeals against Ofcom made up more than 80% of the workload of the Competition Appeal Tribunal, whose work was supposed to cover a whole gamut of matters. When there was a consultation in the fringes of the DEA, it was decided to restrict appeal to judicial review and appeal on process. I just want to make sure that we are not opening up a huge and unnecessary delaying tactic.
I thank all those who have spoken, and I very much appreciate the spirit in which the amendments were tabled. They propose changes to the standard of appeal, the standing to appeal and the appeals process itself. The Government are concerned that enabling a review of the full merits of cases, as proposed by Amendments 243 and 245, could prove burdensome for the courts and the regulator, since a full-merits approach, as we have been hearing, has been used by regulated services in other regulatory regimes to delay intervention, undermining the effectiveness of the enforcement process. With deep-pocketed services in scope, allowing for a full-merits review could incentivise speculative appeals, both undermining the integrity of the system and slowing the regulatory process.
While the Government are fully committed to making sure that the regulator is properly held to account, we feel that there is not a compelling case for replacing the decisions of an expert and well-resourced regulator with those of a tribunal. Ofcom will be better placed to undertake the complex analysis, including technical analysis, that informs regulatory decisions.
Amendment 245 would also limit standing and leave to appeal only to providers and those determined eligible entities to make super-complaints under Clause 150. This would significantly narrow the eligibility requirements for appeals. For appeals against Ofcom notices we assess that the broader, well-established standard in civil law of sufficient interest is more appropriate. Super-complaints fulfil a very different function from appeals. Unlike appeals, which will allow regulated services to challenge decisions of the regulator, super-complaints will allow organisations to advocate for users, including vulnerable groups and children, to ensure that systemic issues affecting UK users are brought to Ofcom’s attention. Given the entirely distinct purposes of these functions, it would be inappropriate to impose the eligibility requirements for super-complaints on the appeals system.
I am also concerned about the further proposal in Amendment 245 to allow the tribunal to replace Ofcom’s decision with its own. Currently, the Upper Tribunal is able to dismiss an appeal or quash Ofcom’s decision. Quashed decisions must be remitted to Ofcom for reconsideration, and the tribunal may give directions that it considers appropriate. Amendment 245 proposes instead allowing the Upper Tribunal to
“impose or revoke, or vary the amount of, a penalty … give such directions or take such other steps as OFCOM could itself have given or taken, or … make any other decision which OFCOM could itself have made”.
The concern is that this risks undermining Ofcom’s independence and discretion in applying its powers and issuing sanctions, and in challenging the regulator’s credibility and authority. It may also further incentivise well-resourced providers to appeal opportunistically, with a view to securing a more favourable outcome at a tribunal.
On that basis, I fear that the amendments tabled by the noble Lord would compromise the fundamental features of the current appeals provisions, without any significant benefits, and risk introducing a range of inadvertent consequences. We are confident that the Upper Tribunal’s judicial review process, currently set out in the Bill, provides a proportionate, effective means of appeal that avoids unnecessary expense and delays, while ensuring that the regulator’s decisions can be thoroughly scrutinised. It is for these reasons that I hope the noble Baroness will withdraw the amendment.
My Lords, I am grateful to the Minister. I will take that as a no—but a very well-considered no, for which I thank him. I say to the noble Lord, Lord Clement-Jones, that we certainly would not wish to make him feel uncomfortable at any time. I am grateful to him and the noble Baroness, Lady Kidron, for their contributions. As I said at the outset, this amendment was intended to probe the issue, which I feel we have done. I certainly would not want to open a can of worms—online, judicial or otherwise. Nor would I wish, as the Minister suggested, to undermine the work, efficiency and effectiveness of Ofcom. I am glad to have had the opportunity to present these amendments. I am grateful for the consideration of the Committee and the Minister, and with that I beg leave to withdraw.
Amendment 242 withdrawn.
Clause 148: Appeals against OFCOM decisions relating to the register under section 86
Amendment 243 not moved.
Clause 148 agreed.
Clause 149: Appeals against OFCOM notices
Amendments 244 to 250 not moved.
Clause 149 agreed.
Amendments 250A and 250B not moved.
Clauses 150 to 153 agreed.
Clause 154: Consultation and parliamentary procedure
Amendments 251 to 254 not moved.
Clause 154 agreed.
Clause 155: Directions about advisory committees
Amendments 255 and 256 not moved.
Clause 155 agreed.
Clause 156 agreed.
Clause 157: Secretary of State’s guidance
Amendments 257 to 259 not moved.
Clause 157 agreed.
Amendment 260 not moved.
Clause 158 agreed.
Clause 159: Review
Amendments 261 to 263 not moved.
Clause 159 agreed.
Amendment 264 not moved.
Clause 160: False communications offence
Amendment 264A
Moved by
264A: Clause 160, page 138, line 10, at end insert “including (but not necessarily) by making use of a stolen identity, credit card or national insurance number,”
Member’s explanatory statement
This amendment, together with the amendment to page 138, line 12 to which Lord Clement-Jones has added his name, seeks to probe the creation of a specific criminal offence of identity theft.
My Lords, even by the standards of this Bill, this is a pretty diverse group of amendments. I am leading the line with an amendment that does not necessarily fit with much of the rest of the group, except for Amendment 266, which the noble Baroness, Lady Buscombe, will be speaking to. I look forward to hearing her speak.
This amendment is designed to probe the creation of a new offence of identity theft in Clause 160. As I argued in my evidence to the consultation on digital identity and attributes in 2021, a new offence of identity theft is required. Under the Fraud Act 2006, the Identity Documents Act 2010, the Forgery and Counterfeiting Act 1981, the Computer Misuse Act 1990 and the Data Protection Act 2018 there are currently the offences of fraud using a false identity, document theft, forging an identity, unauthorised computer access and data protection offences respectively, but no specific crime of digital identity theft.
I was strongly influenced by the experience of the performer Bennett Arron, the award-winning writer and stand-up comedian, who had his digital ID stolen over 20 years ago and, as a result, became penniless and homeless. I think he was the first major victim of identity theft in the UK. Years later, he wrote a comedy show, surprisingly, about his experience. It is a disturbingly true yet funny account of what it is like to have your identity stolen. It was critically acclaimed—I know the Minister will appreciate this—at the Edinburgh festival and led to Bennett being asked to direct a documentary for Channel 4 Television. In the documentary, “How to Steal an Identity”, Bennett proved, through a series of stunts, how easy the crime of ID theft is to carry out. He also managed to steal the identity of the British Home Secretary—I am not sure which one; that needs further investigation. As he says, having something tangible stolen—your phone, a bike or a car—is awful and upsetting. However, they are replaceable, and their loss is unlikely to affect your whole life. Having your identity stolen is different: you will have great difficulty in restoring it, and the consequences can affect you for ever. He went on to describe his experiences.
Interestingly enough, the ICO has published guidance on identity theft. It says:
“Your identity is one of your most valuable assets. If your identity is stolen, you can lose money and may find it difficult to get loans, credit cards or a mortgage. Your name, address and date of birth provide … information”,
and an
“identity thief can use a number of methods to find out your personal information and will then use it to open bank accounts, take out credit cards”,
et cetera, in your name. The guidance goes on to talk about what signs you should look for and what you should do in the event of identity theft. However, effectively, it says that all you can do if some documents are stolen is tell the police; it does not tell you whether the police can do anything about it. All the guidance does is suggest that you report physical documents as having been stolen.
I have asked many questions of the Government, who, in response to the consultation on digital identity, proved stubbornly reluctant to commit to creating a new offence. It is not at all clear why. I am just sorry that the noble Baroness, Lady Morgan, is not here. She chaired the Select Committee and its terrific report Fighting Fraud: Breaking the Chain, which said:
“Identity theft is a fundamental component of fraud and is routinely used by fraudsters to steal money from legitimate individuals and organisations yet it remains out of scope of criminal offences”.
Its recommendation was:
“The Government should consult on the introduction of legislation to create a specific criminal offence of identity theft. Alternatively, the Sentencing Council should consider including identity theft as a serious aggravating factor in cases of fraud”.
In their response, the Government said:
“We agree that identity theft is a vector”—
that is a great word—
“used by fraudsters to commit fraud, but current legislation provides an effective avenue to prosecute those committing identity frauds”.
That is absolutely not the case. I look forward to what the Minister says about that, but I believe that there is a case for including an identity-theft offence in this Bill, and an even stronger case for including it in the Data Protection and Digital Information Bill when we come to it later.
I will not be able to wind up on this group because I am speaking to this amendment, but I strongly support my noble friend’s Amendments 268AZB and 268AZC. I believe that the Law Commission’s intention was very much to create a high bar for the offence of encouragement of self-harm. It says:
“Our aim is only to criminalise the most serious examples of encouragement of self-harm”.
However, out there, a lot of the support organisations are worried about the broadness of the offence. They are concerned that it risks criminalising peer-support and harm-reduction resources, and that it may also criminalise content posted by people in distress who are sharing their own experiences of self-harm. That is why I support the amendment that my noble friend will speak to. I beg to move.
My Lords, I shall speak to Amendments 266 and 267, to which my noble and learned friend Lord Garnier, my noble friend Lord Leicester and the noble Lord, Lord Clement-Jones, have added their names. They are the final two amendments from a group of amendments that were also supported by the noble Lord, Lord Moore of Etchingham, and the noble Baroness, Lady Mallalieu.
The purpose of this Bill is to make the internet a safer place. The new offence of false communications is just one of the important means by which it seeks to do so, making it an offence to harm people by telling lies online—and this is welcome. It is right that the Bill should focus on preventing harms to individuals. One of the most important guarantors that a person can have of good health and well-being is their freedom to pursue their livelihood unimpeded by illegitimate hostile action. Attacks on people’s livelihoods have the potential to wreak unimaginable harm on their mental and physical health, but these attacks are also among the easiest to perpetrate through the internet. My amendments seek to prevent such harms by protecting people who run, or work for, businesses that have been targeted with malicious fake reviews posted to online platforms, such as Google Maps or TripAdvisor. These platforms already fall within the scope of this Bill as hosts of user-generated content.
By referencing fake reviews, I am not referring to legitimate criticism, fair comment or even remarks about extraneous matters such as the owners’ pastimes or opinions, provided that the reviewer is honest about the nature of their relationship with the business. If someone wants to write a review of a business which they admit they have never patronised, and criticise it based on such factors, this would not be illegal, but it would very likely breach the platform’s terms of service and be removed. Review platforms are not the proper venue for such discussions; their role is to let people share opinions about a business’s products and services, but policing that is up to them.
The malicious fake reviews that I am referring to are those that are fundamentally dishonest. People with grudges to bear know that the platforms they use to attack their victims will remove any reviews that are clearly based on malice rather than a subjective assessment of quality. That is why they have come to adopt more insidious tactics. Without mentioning the real reason for their hostility towards a business and/or its staff, they purport to be customers who have had bad experiences. Of course, in almost every case, the reviewer has never so much as gone near the business. The review is therefore founded on lies.
This is not merely an abstract concern. Real people are being really harmed. Noble Lords will know that in earlier debates I used the prism of rural communities to amplify the objective of my amendments. Only yesterday, during Oral Questions in your Lordships’ House, there was an overwhelming collective consensus that we need to do more to protect the livelihoods of those working so hard in rural communities. My simple amendments would make a massive difference to their well-being.
The Countryside Alliance recently conducted a survey that found innumerable instances of ideologically motivated fake reviews targeted at rural businesses; these were often carried out by animal rights extremists and targeted businesses and their employees who sometimes participated in activities to which they objected, such as hosting shoots or serving meat. In April this year, the Telegraph reported on one case of a chef running a rural pub whose business was attacked with fake reviews by a vegan extremist who had verifiably never visited the pub, based initially on the man’s objection to him having posted on social media a picture of a roast chicken. The chef said these actions were making him fear for his livelihood as his business fought to recover from the pandemic. He is supporting my amendments.
Amendment 266 would therefore simply add the word “financial” to “physical” and “psychological” in the Bill’s definition of the types of harm that a message would need to cause for it to amount to an offence. This amendment is not an attempt to make the Bill into something it was not designed to be. It is merely an attempt to protect the physical and mental health of workers whose businesses are at risk of attack through malicious fake reviews. It may be that the victim of such an attack could argue that a fake review has caused them physical or psychological harm, as required under the Bill as currently drafted—indeed, it would likely do so. The reason for adding financial harm is to circumvent the need for victims to make that argument to the police, for the police to make it to the Crown Prosecution Service, and then for prosecutors to make it to a jury.
That links to Amendment 267, which would enlarge the definition of parties who may be harmed by a message for it to amount to an offence. Under the Bill, a message must harm its intended, or reasonably foreseeable, recipient; however, it is vital to understand that a person need not receive the message to be harmed by it. In the case of fake reviews, the victim is harmed because the false information has been seen by others; he or she is not an intended recipient. The amendment would therefore include harms to the person or organisation to which the information—or, in reality, disinformation—contained within it relates.
My principal objective in bringing these amendments is not to create a stick with which to beat those who wish harm to others through malicious fake reviews; rather—call me old-fashioned—it is about deterrence. It is to deter this conduct by making it clear that it is not acceptable and would, if necessary, be pursued by police and through the courts under criminal law. It is about seeing to it that malicious fake reviews are not written and their harm is not caused.
I am aware that the Government have responded to constituents who have contacted their MPs in support of these amendments to say that they intend to act through the Competition and Markets Authority against businesses that pay third parties to write fake disparaging reviews of their competitors. I must stress to my noble friend the Minister, with respect, that this response misunderstands the issue. While there is a problem with businesses fraudulently reviewing their competitors to gain commercial advantage—and it is welcome that the Government plan to act on it—I am concerned with extreme activists and other people with ideological or personal axes to grind. These people are not engaged in any relevant business and are not seeking to promote a competitor by comparison. It is hard to see how any action by the Competition and Markets Authority could offer an effective remedy. The CMA exists to regulate businesses, not individual cranks. Further, this is not a matter of consumer law.
If the Government wish to propose some alternative means of addressing this issue besides my amendments, I and those who have added their names—and those who are supporters beyond your Lordships’ House—would be pleased to engage with Ministers between now and Report. In that regard though, I gently urge the Government to start any conversation from a position of understanding—really understanding—what the problem is. I fully appreciate that the purpose of this Bill is to protect individuals, and that is the key point of my amendments. My focus is upon those running and working in small businesses who are easy targets of this form of bullying and abuse. It is entirely in keeping with the spirit and purpose of the Bill to protect them.
Finally, I must be clear that the prism of what happens in our rural areas translates directly to everything urban across the UK. A practical difference is that people working in remote areas are often very isolated and find this intrusion into their life and livelihood so hard to cope with. We live in a pretty unpleasant world that is diminishing our love of life—that is why this Bill is so necessary.
My Lords, I wish to add to what my noble friend Lady Buscombe has just said, but I can do so a little more briefly, not least because she has made all the points that need to be made.
I would disagree with her on only one point, which is that she said—I am not sure that she wanted to be called old-fashioned, but she certainly wanted to have it explained to us—that the purpose of our amendment was to deter people from making malicious posts to the detriment of businesses and so forth.
It is very natural for a civilised, humane person to want to protect those who cannot protect themselves because of the anonymity of the perpetrator of the act. Over the last nearly 50 years, I have practised at the media Bar, including in cases based on the tort of malicious falsehood, trade libel or slander of goods. Essentially, my noble friend and I are trying to bring into the criminal law the torts that I have advised on and appeared in cases involving, so that the seriousness of the damage caused by the people who do these anonymous things can be met with the weight of the state as the impartial prosecutor.
I say to my noble friend on the Front Bench that this is not just a complaint by those who like eating meat, those who earn a living through country pursuits or those who wish to expand their legal practices from the civil sphere to the criminal. It is a plea for the Government and Parliament to reach out and protect those who cannot help themselves.
Now, there will be evidential difficulties in getting hold of anonymous posters of malicious comments and reviews. It may be said that adding to the criminal law, as we would like to do through amending Clause 160, will interfere with people’s Article 10 rights. However, Article 10 does not permit you to make malicious and deliberately false remarks about others. Section 3 of the Defamation Act 2013, which provides for the defence of honest opinion, is not affected by the criminalisation of false and malicious posts about other people’s businesses or services.
We have a very simple remedy here, which goes with the grain of British fair play, the need for justice to be done and a Government who care for the people they govern and look after, and who make sure they do not fall victim unwittingly and unknowingly—unknowingly in the sense that they do not know who is trying to hurt them, but they know what has happened to them because their profits, turnover and ability to feed their families have been grossly affected by these malicious, dishonest people. This amendment needs careful consideration and deserves wholehearted support across the House.
My Lords, as the noble Lord, Lord Clement-Jones, said, this is a very broad group, so I hope noble Lords will forgive me if I do not comment on every amendment in it. However, I have a great deal of sympathy for the case put forward by my noble friend Lady Buscombe and my noble and learned friend Lord Garnier. The addition of the word “financial” to Clause 160 is not only merited on the case made but is a practical and feasible thing to do in a way that the current inclusion of the phrase “non-trivial psychological” is not. After all, a financial loss can be measured and we know how it stands. I will also say that I have a great deal of sympathy with what the noble Lord, Lord Clement-Jones, said about his amendment. In so far as I understand them—I appreciate that they have not yet been spoken to—I am also sympathetic to the amendments in the names of the noble Baroness, Lady Kennedy of The Shaws, and the noble Lord, Lord Allan of Hallam.
I turn to my Amendment 265, which removes the word “psychological” from this clause. We have debated this already, in relation to other amendments, so I am going to be fairly brief about it. Probably through an oversight of mine, this amendment has wandered into the wrong group. I am going to say simply that it is still a very, very good idea and I hope that my noble friend, when he comes to reflect on your Lordships’ Committee as a whole, will take that into account and respond appropriately. Instead, I am going to focus my remarks on the two notices I have given about whether Clauses 160 and 161 should stand part of the Bill; Clause 161 is merely consequential on Clause 160, so the meat is whether Clause 160 should stand part of the Bill.
I was a curious child, and when I was learning the Ten Commandments—I am sorry to see the right reverend Prelate has left because I hoped to impress him with this—I was very curious as to why they were all sins, but some of them were crimes and others were not. I could not quite work out why this was; murder is a crime but lying is not a crime—and I am not sure that at that stage I understood what adultery was. In fact, lying can be a crime, of course, if you undertake deception with intent to defraud, and if you impersonate a policeman, you are lying and committing a crime, as I understand it—there are better-qualified noble Lords than me to comment on that. However, lying in general has never been a crime, until we get to this Bill, because for the first time this Bill makes lying in general—that is, the making of statements you know to be false—a crime. Admittedly, it is a crime dependent on the mode of transmission: it has to be online. It will not be a crime if I simply tell a lie to my noble and learned friend Lord Garnier, for example, but if I do it online, any form of statement which is not true, and I know not to be true, becomes a criminal act. This is really unprecedented and has a potentially chilling effect on free speech. It certainly seems to be right that, in your Lordships’ Committee, the Government should be called to explain what they think they are doing, because this is a very portentous matter.
The Bill states that a person commits the false communications offence if they send a message that they know to be false, if they intend the message to cause a degree of harm of a non-trivial psychological or physical character, and if they have no reasonable excuse for sending the message. Free speech requires that one should be allowed to make false statements, so this needs to be justified. The wording of the offence raises substantial practical issues. How is a court meant to judge what a person knows to be false? How is a committee of the House of Commons meant to judge, uncontroversially, what a person knows to be false at the time they say it? I say again: what is non-trivial psychological harm and what constitutes an excuse? None of these things is actually defined; please do not tell me they are going to be defined by Ofcom—I would not like to hear that. This can lead to astonishing inconsistency in the courts and the misapplication of criminal penalties against people who are expressing views as they might well be entitled to do.
Then there is the question of the audience, because the likely audience is not just the person to whom the false statement is directed but could be anybody who subsequently encounters the message. How on earth is one going to have any control over how that message travels through the byways and highways of the online world and be able to say that one had some sense of who it was going to reach and what non-trivial psychological harm it might cause when it reached them?
We are talking about this as if this criminal matter is going to be dealt with by the courts. What makes this whole clause even more disturbing is that in the vast majority of cases, these offences will never reach the courts, because there is going to be, inevitably, an interaction with the illegal content duties in the Bill. By definition, these statements will be illegal content, and the platforms have obligations under the Bill to remove and take down illegal content when they become aware of it. So, the platform is going to have to make some sort of decision not only about the truth of the statement but about whether the person knows that the statement is false and what their intention is. Under the existing definition of illegal content, they will be required to remove anything they reasonably believe is likely to be false and to prevent it spreading further, because the consequences of it, in terms of the harm it might do, are incalculable by them at that point.
We are placing a huge power of censorship—and mandating it—on to the platforms, which is one of the things that some of us in this Committee have been very keen to resist. Just exploring those few points, I think my noble friend really has to explain what he thinks this clause is doing, how it is operable and what its consequences are going to be for free speech and censorship. As it stands, it seems to me unworkable and dangerous.
Does my noble friend agree with me that our courts are constantly looking into the state of mind of individuals to see whether they are lying? They look at what they have said, what they have done and what they know. They can draw an inference based on the evidence in front of them about whether the person is dishonest. This is the daily bread and butter of court. I appreciate the points he is making but, if I may say so, he needs to dial back slightly his apoplexy. Underlying this is a case to be made in justice to protect the innocent.
I did not say that it would be impossible for a court to do this; I said it was likely to lead to high levels of inconsistency. We are dealing with what are likely to be very specialist cases. You can imagine this in the context of people feeling non-trivially psychologically harmed by statements about gender, climate, veganism, and so forth. These are the things where you see this happening. Consistency across the courts in dealing with these issues is, I think, very unlikely. It will indeed have a chilling effect on people being able to express views that may be controversial but are still valid in an open society.
My Lords, I want to reflect on the comments that the noble Lord, Lord Moylan, has just put to us. I also have two amendments in the group; they are amendments to the government amendment, and I am looking to the Minister to indicate whether it is helpful for me to explain the rationale of my amendments now or to wait until he has introduced his. I will do them collectively.
First, the point the noble Lord, Lord Moylan, raised is really important. We have reached the end of our consideration of the Bill; we have spent a lot of time on a lot of different issues, but we have not spent very much time on these new criminal offences, and there may be other Members of your Lordships’ House who were also present when we discussed the Communications Act back in 2003, when I was a Member at the other end. At that point, we approved something called Section 127, which we were told was essentially a rollover of the dirty phone call legislation we had had previously, which had been in telecoms legislation for ever to prevent that deep-breathing phone call thing.
It went through almost on the nod, and then it turned out to become a very significant offence later. Noble Lords may be aware of the Twitter joke trial, the standout trial that people have followed, where an individual was prosecuted for saying something on Twitter which was originally taken very seriously as a bomb threat. Later, on appeal, the conviction was overturned. The debate around Section 127 and its usage has gone backwards and forwards.
There is a genuine question to be asked here about whether we will be coming back in a few years and finding that this offence has been used in ways that we were not expecting or intending. I think we all know what we are trying to get at: the person who very deliberately uses falsehoods to cause serious harm to others. That has been described by the noble Baroness, Lady Buscombe, and the noble and learned Lord, Lord Garnier. However, you can see how the offence could inadvertently capture a whole load of other things where we would either collectively agree that they should not be prosecuted or have quite different opinions about whether they should be prosecuted.
I used to sit in judgment on content at a platform, and people would often say to us, “Why are you allowing that content? It’s false”. We would say, quite rightly, that we had no terms of service that say you cannot lie or issue falsehoods on the platform. They would say, “Why not?”, and we would dig into it and say, “Let’s just deal with some falsehoods”. For example, “The earth is flat”—I think most of us would agree that that is false, but it is entirely harmless. We do not care; it is a lie we do not care about. What about, “My God is the only true God”? Well, that is an opinion; it is not a statement of fact—but we are getting into a zone that is more contested. As for, “Donald Trump won the election”, that is an absolute and outright lie to one group of people but a fundamental issue of political expression for another. You very quickly run into these hotly contested areas. This is going to be a real challenge.
We have often talked in this debate about how we are handing Ofcom a really difficult job across all the measures in the Bill. In this case, I think we are handing prosecutors a really difficult job of having to determine when they should or should not use the new offence we are giving them. I think it will be contested, and we may well want to come back later and look at whether it is being used appropriately. I am sure the noble Baroness, Lady Fox, will have something to say about this.
If we do anything, we should learn from previous experiences. I think everyone would agree that Section 127 of the Communications Act 2003, which has been in place for 20 years, has not been an easy ride. We have moved our view on that around considerably. One good thing is that these new offences replace it—we are replacing this thing we rushed through then with a different version—but we need to test very carefully whether we have got it right.
Moving away from that offence to the new offence of the encouragement of self-harm, which the Minister will introduce shortly, I have two amendments which are quite different from each other, and I want to explain each of them. The first, Amendment 268AZC, seeks to test the threshold for prosecutions under this offence—so, again, it is about when we should or should not prosecute. It follows concerns raised by my noble friend Lord Clement-Jones. Legitimate concerns have been raised by a coalition of about 130 different individuals and organisations supporting survivors of self-harm. The question, really, is what the Government’s intention is. I hope the Minister can put something on the record which will be helpful later in terms of the Government’s intention for where the threshold should lie.
We can see clear instances where somebody maliciously and deliberately encourages self-harm. There are other issues around the way in which systems encourage self-harm, which I think are tackled in other parts of the Bill, but here we are talking about an individual carrying out an action which may be prosecuted. We can see those people, but there is then a spectrum: people who support victims of self-harm, people who provide educational and support materials—some of which might include quite graphic descriptions—people who run online fora, and indeed the individuals themselves who are posting self-harm content and documenting what they have done to themselves.
What I am looking for, and I think other noble Lords are looking for, is an assurance that it is not our intention to capture those people within this new offence. When the Minister outlines the offence, I would appreciate an assurance about why those fears will not be realised. We have suggested in the amendment to have a bar involving the Director of Public Prosecutions. I am sure the Minister will explain why that is not the right gating mechanism—but if not that, what is going to ensure that people in distress are not brought into the scope of this offence?
It would be very helpful to know what discussions the Minister is having with the Ministry of Justice, particularly about working with relevant organisations on the detail of what is being shared. Again, we can talk about it in the abstract; I found that specific cases can be helpful. I hope that there are people, either in his department or in the Ministry of Justice, who are talking to organisations, looking at the fora and at the kind of content that people in distress post, applying themselves and saying, “Yes, we are confident that we will not end up prosecuting that individual or that organisation”, or, if it is likely that they will be prosecuted, “We need to have a longer discussion about that if that is not the intention”.
Moving backwards through the letters, Amendment 268AZB takes us back to a question raised earlier today by the noble Baroness, Lady Kidron, but also on the first day in Committee, way back when—I am starting to feel nostalgic. She proposed an amendment to broaden the scope of this regulation to all online services, whether or not they are in the regulated user-to-user and search bucket for children’s protection purposes. I argued against that. I continue to view this legislation as appropriately targeted, and I worry about broadening the scope—but here I have a lot of sympathy. The noble Baroness raised a specific scenario, which again is a real one, of an individual outside the UK jurisdiction—let us imagine that they are in the US, because they will have first amendment protections. They run a blog, so it is not user to user, which is targeting people in the UK and saying, “You should harm yourself. You should commit suicide”. That individual is doing nothing wrong in their terms. They are now, under the terms of the Bill, committing a criminal offence in the UK, but there is virtually no prospect that they will ever be prosecuted unless they come to the UK. The amendment is seeking to tease out what happens in those scenarios.
I do not expect the Minister to accept my amendment as drafted—I thought it was cheeky even as I was drafting it. It says that for the purposes that we want, we will apply a whole bunch of the Online Safety Bill measures—the disruption, the blocking, the business disruption measures—to websites that promote this kind of content, even though they are not otherwise regulated. I am sure that the Minister has very good legal arguments in his notes as to why that would not work, but I hope he will tell us what else the Government can do. I do not think that people will find it acceptable if we go to all the trouble of passing this legislation but there is a category of online activity—which we know is there; it is real—that we can do nothing about. They are breaking our criminal law; we have taken all this time to construct a new form of criminal law, and yet they are sitting there beyond our reach, ignoring it, and we can do nothing. I hope the Minister can offer some suggestions as to what we might do.
I see that the noble Baroness, Lady Finlay of Llandaff, is in her place; she asked me to raise questions around the extent to which we have been using the powers we have today, but I will not as I think she will do so herself.
The core question motivating the second amendment is what the Government think we might do about individuals who are not user-to-user or search services and are therefore outside the regulated bucket, who persistently and deliberately commit this new offence of encouragement to self-harm and yet are outside the UK jurisdiction. I hope the Minister agrees that something should be done about that scenario, and I look forward to hearing his suggestion about what may be done.
My Lords, I have two amendments in this grouping. I am afraid that I did not have time to get others to put their names to them, but I hope that they will find some support in this Committee.
For almost the whole of 2021, I chaired an inquiry in Scotland into misogyny. It arose from the fact that many complaints were being made to the devolved Government in Scotland about women’s experiences not just of online harassment but of the way in which the disinhibition that the internet and social media have given people to be abusive online is now also visiting the public square. Many people described the ways in which they are publicly harassed. I know that concerns people in this House too.
When I came to the Bill, I was concerned about something that became part of the evidence we heard. It is no different down here from in Scotland. As we know, many women—I say women, but men receive harassment online too—are sent really vicious, vile things. We all know of parliamentarians and journalists who have received them, their lives made a misery by threats to rape and kill and people saying, “Go and kill yourself”. There are also threats of disfigurement—“Somebody should take that smile off your face”—and suggestions that an acid attack be carried out on someone.
In hearing that evidence, it was interesting that some of the forms of threat are not direct in the way that criminal law normally works; they are indirect. They are not saying, “I’m going to come and rape you”. Sometimes they say that, but a lot of the time they say, “Somebody should rape you”; “You should be raped”; “You deserve to be raped”; “You should be dead”; “Somebody should take you out”; “You should be disfigured”; “Somebody should take that smile off your face, and a bit of acid will do it”. They are not saying, “I’m going to come and do it”, in which case the police go round and, if the person is identifiable, make an arrest—as happened with Joanna Cherry, the Scottish MP, for example, who had a direct threat of rape, and the person was ultimately charged under the Communications Act.
Our review of the kinds of threat taking place showed that it was increasingly this indirect form of threat, which has a hugely chilling effect on women. It creates fear and life changes, because women think that some follower of this person might come and do what is suggested and throw acid at them as they are coming out of their house, and they start rearranging their lives because of it—because they live in constant anxiety about it. It was shocking to hear the extent to which this is going on.
In the course of the past year, we have all become much more familiar with Andrew Tate. What happens with these things is that, because of the nature of social media and people having big followings, you get the pile-on: an expression with which I was not that familiar in the past but now understand only too well. The pile-on is where, algorithmically, many different commentaries are brought together and suddenly the recipient receives not just one, or five, but thousands of negative and nasty threats and comments. Of course, as a public person in Parliament, or a councillor, you are expected to open up your social media, because that is how people will get in touch with you or comment on the things you are doing, but then you receive thousands of these things. This affects journalists, Members of Parliament, councillors and the leaders of campaigns. For example, it was interesting to hear that people involved in the Covid matters received threats. It affects both men and women, but the sexual nature of the threats to women is horrifying.
The Andrew Tate case is interesting because only yesterday I saw in the newspapers that part of the charging in Romania relates to his enormous following and his encouragement of violence towards women: alongside the many charges that are directly about violence towards and the rape of women, he is being charged for his incitement of these behaviours in many of his young male followers. In the report of the inquiry that I conducted, there are a number of recommendations around offences of that sort.
To specifically deal with this business of online threats, my amendments seek to address their indirect nature—not the ones that say, “I’m going to do it”, but the encouragement to others to do it or to create the fear that it will happen—and to look at how the criminal law addresses that.
I started out with the much lesser suggestion of inserting that the threat be either by the person who is sending the message or another individual, but the more I reflected on it, the more it seemed to me that that did not go far enough to deal with what one was seeking to address here. This is why I have an alternative, Amendment 267AB.
Noble Lords will see that it says:
“A person commits an offence if they issue a communication concerning death”.
I have written “concerning death” rather than “a threat to kill” because the former can include someone saying, “You should kill yourself”, “You should commit suicide” or “You should harm yourself”. The amendment also refers to “assault (sexual or otherwise)”, which could include all manner of sexual matters but also self-harming and “disfigurement”. I was shocked at the extent to which disfigurement is suggested in these kinds of abusive texts to all manner of people—even campaigners. Even a woman who campaigns on air pollution after the death of her child from a fatal asthmatic attack receives the most horrible threats and abuse online. When people do this, they know the impact that it is going to have. They do it, as my amendment says,
“knowing it will cause alarm or distress to a specific person or specific people”,
rather than making generalised threats to the world.
As the Minister will know, I wrote to him wondering whether his team might put their great legal minds to this because we have to find a way of addressing the fact that people are encouraging others to make threats. We have to look at the effect that this has on the recipients, who are often women in public life in one way or another; the way in which it affects our polity and women’s participation, not just in public life but in politics and civil society generally; and the way in which it deters women from living their lives freely and equally with menfolk.
I hope that the Committee will think on that and that the Minister can come back to me with some positive things, even if he does not accept the particular formulation that I sought to devise. It may be that a different formulation could be sought, perhaps to include that it is done “recklessly”. I am prepared to consider its impact on people, but I think that it is done with knowledge of the impact that it will have, and where it is foreseeable that there will be an impact.
I urge the Committee to consider these matters. As I just heard one of my colleagues in the Committee suggest, this is a moment to seize. You can be sure that we cannot encapsulate everything, but we should be trying to cover as much as possible of the horrors of what is now happening on social media.
My Lords, I will address my remarks to government Amendment 268AZA and its consequential amendments. I rather hope that we will get some reassurance from the Minister on these amendments, about which I wrote to him just before the debate. I hope that that was helpful; it was meant to be constructive. I also had a helpful discussion with the noble Lord, Lord Allan.
As has already been said, the real question relates to the threshold and the point at which this measure will kick in. I am glad that the Government have recognised the importance of the dangers of encouraging or assisting serious self-harm. I am also grateful for the way in which they have defined it in the amendment, relating it to grievous bodily harm and severe injury. The amendment says that this also
“includes successive acts of self-harm which cumulatively reach that threshold”.
That is important; it means, rather than just one act, a series of them.
However, I have a question about subsection (10), which states that:
“A provider of an internet service by means of which a communication is sent, transmitted or published is not to be regarded as a person who sends, transmits or publishes it”.
We know from bereaved parents that algorithms have been set up which relay this ghastly, horrible and inciteful material that encourages and instructs. That is completely different from those organisations that are trying to provide support.
I am grateful to Samaritans for all its help with my Private Member’s Bill, and for the briefing that it provided in relation to this amendment. As it points out, over 5,500 people in England and Wales took their own lives in 2021 and self-harm is
“a strong risk factor for future suicide”.
Interestingly, two-thirds of those taking part in a Samaritans research project said that
“online forums and advice were helpful to them”.
It is important that there is clarity around providing support and not encouraging and goading people into activity which makes their self-harming worse and drags them down to eventually ending their own lives. Three-quarters of people who took part in that Samaritans research said that they had
“harmed themselves more severely after viewing self-harm content online”.
It is difficult to know exactly where this offence sits and whether it is sufficiently narrowly drawn.
I am grateful to the Minister for arranging for me to meet the Bill team to discuss this amendment. When I asked how it was going to work, I was somewhat concerned because, as far as I understand it, the mechanism is based on the Suicide Act, as amended, which talks about the offence of encouraging or assisting suicide. The problem as I see it is that, as far as I am aware, there has not been a string of prosecutions following the suicide of many young people. We have met their families and they have been absolutely clear about how their dead child or sibling—whether a child or a young adult—was goaded, pushed and prompted. I recently had experience outside of a similar situation, which fortunately did not result in a death.
The noble Lord, Lord Allan, has already addressed some of the issues around this, and I would not want the amendment not to be there because we must address this problem. However, if we are to have an offence here, with a threshold that the Government have tried to define, we must understand why, if assisting and encouraging suicide on the internet is already a criminal offence, nothing has happened and there have been no prosecutions.
Why is subsection (10) in there? It seems to sidestep the whole problem of harmful content being forwarded on through dangerous algorithms. We know that a lot of the people who are mounting this are not in the UK, and therefore will be difficult to catch. It is the onward forwarding through algorithms that increases the volume of messaging to the vulnerable person and drives them further into the downward spiral that they find themselves in—which is perhaps why they originally went to the internet.
I look forward to hearing the Government’s response, and to hearing how this will work.
My Lords, this group relates to communications offences. I will speak in support of Amendment 265, tabled by the noble Lord, Lord Moylan, and in support of his opposition to Clause 160 standing part of the Bill. I also have concerns about Amendments 267AA and 267AB, in the name of the noble Baroness, Lady Kennedy. Having heard her explanation, perhaps she can come back and give clarification regarding some of my concerns.
On Clause 160 and the false communications offence, unlike the noble Lord, Lord Moylan, I want to focus on psychological harm and the challenge this poses for freedom of expression. I know we have debated it before but, in the context of the criminal law, it matters in a different way. It is worth us dwelling on at least some aspects of this.
The offence refers to what is described as causing
“non-trivial psychological or physical harm to a likely audience”.
As I understand it—perhaps I need some clarity here—it is not necessary for the person sending the message to have intended to cause harm, yet there is a maximum sentence of 51 weeks in prison, a fine, or both. We need to consider the nature of the harm we are talking about in the context of a huge cultural shift.
J.S. Mill’s harm principle has now been expanded, as previously discussed, to include traumatic harm caused by words. Speakers are regularly no-platformed for ideas that we are told cause psychological harm, at universities and more broadly as part of the whole cancel culture discussion. Over the last decade, harm and safety have come to refer no longer just to physical safety; the physical and the psychological have been conflated. Historically, we understood physical threats and violence as distinct from speech, however aggressive or incendiary that speech was; we did not say that speech was the same as or interchangeable with bullets or knives or violence—and now we do. I want us to at least pause here.
What counts as psychological harm is not a settled question. The worry is that we are unable to ascertain objectively what psychological harm has occurred. This will inevitably lead to endless interpretation controversies and/or subjective claims-making, at least some of which could be in bad faith. There is no median with respect to how humans view or experience controversial content. There are wildly divergent sensibilities about what is psychologically harmful. The social media lawyer Graham Smith made a really good point when he said that speech is not a physical risk,
“a tripping hazard … a projecting nail … that will foreseeably cause injury … Speech is nuanced, subjectively perceived and capable of being reacted to in as many different ways as there are people.”
That is true.
We have seen an example of the potential disputes over what creates psychological harm in a case in the public realm over the past week. The former Culture Secretary, Nadine Dorries, who indeed oversaw much of this Bill in the other place, had her bullying claims against the SNP’s John Nicolson MP overturned by the standards watchdog. Her complaints had previously been upheld by the standards commissioner. John Nicolson tweeted, liked and retweeted offensive and disparaging material about Ms Dorries 168 times over 24 hours—which, as they say, is a bit OTT. He “liked” tweets describing Ms Dorries as grotesque, a “vacuous goon” and much worse. It was no doubt very unpleasant for her and certainly a personalised pile-on—the kind of thing the noble Baroness, Lady Kennedy, just talked about—and Ms Dorries would say it was psychologically harmful. But her complaint was overturned on the basis of new evidence. What was this evidence? Ms Dorries herself was a frequent and aggressive tweeter. So, somebody is a recipient of something they say causes them psychological harm, and it has now been said that it does not matter because they are the kind of person who causes psychological harm to other people. My concern about turning this into a criminal offence is that the courts will be full of those kinds of arguments, which I do not think we want.
One problem I have is that the Bill does not give any explicit protection for statements made in the public interest or for the purposes of debate, even if, as has been indicated, false allegations are being made. People use hyperbole and exaggeration in political argument. Many of the big political questions of the day are not agreed on, and people accuse each other of lying all the time when it comes to anything from Brexit to gender. I am worried that inadvertently—I do not think anyone is trying to do this—Clause 160 will institutionalise and bake into primary legislation the core of cancel culture and lead to a more toxic climate than any of us would want.
I have some reservations about Amendment 267AB. I recognise and am full of admiration for the intention of the noble Baroness, Lady Kennedy, but the wording seems to make it an offence to issue
“a communication concerning death, rape, assault (sexual or otherwise) or disfigurement, knowing it will cause alarm or distress”.
I query how we would prove that someone knows it will cause offence, an issue that was slightly danced around. Also, a lot of content such as news websites, podcasts and various other communications could be said purposefully to cause alarm, offence or distress, perhaps because the intention is to shock people into realising what a war or a famine is like, or into understanding the dangers of groomers or suicide sites—the kind of things we have been discussing. During lockdown, the nudge unit explicitly issued communications about potential death that caused a great deal of alarm and distress. It had a public interest defence, which was that it was important that people were frightened into complying with the rules of lockdown, whatever one thinks of them. I do not see how that will not be caught up in this.
Amendment 267AA extends the offence to include encouraging someone else to commit harm. I understand that this is an attempt to deal with indirect misogynistic abuse that is not quite incitement but on the other hand seems to be A encouraging B—an indirect threat. I worry that if A encourages another person and that person does something as a consequence, it will lead to a “he told me to do it” defence and an abdication of responsibility. I have that qualm about it. The Member’s explanatory statement makes things even more difficult, using the phrase,
“if an individual sends a message which potentially encourages other individuals to carry out a harmful act”.
That is going to be wide open to abuse by all sorts of bad-faith complainants.
I say all this as someone who is regularly piled on. I noticed when I started to speak that the noble Baroness, Lady Stuart, is in her place. I remember my shock and horror when I saw the abuse she got some seven years ago and subsequently—really vicious, vile, horrible abuse for her political stance. Such abuse often takes a very sexualised form if you are a woman. So I can say from my lived experience that I know what it feels like to be on the receiving end of vile, horrible, misogynistic pile-ons, and so do Joanna Cherry, Rosie Duffield and a lot of people involved in contentious political issues. We need to make politics more civil by having the arguments and the debates and not mischaracterising, delegitimising or demonising people. I am just not sure that a criminal intervention here is going to help. I think it might make matters worse.
My Lords, as the noble Lord, Lord Clement-Jones, said in introducing this group some time ago, it is very diverse. I shall comment on two aspects of the amendments in this group. I entirely associate myself with the remarks of the noble Lord, Lord Allan, who really nailed the problems with Amendment 266, and I very much support the amendments in the name of the noble Baroness, Lady Kennedy of The Shaws; I would have signed them if I had caught up with them.
The noble Baroness, Lady Fox, talked about causing alarm and distress. I can draw on my own experience here, thinking about when someone randomly starts to post you pictures of crossbows. I think about what used to happen when I was a journalist in Bangkok, when various people used to get hand grenades posted into their letterbox. That was not actively dangerous—the pin was not pulled; it was still held down—but it was clearly a threat, and the same thing happens on social media.
This is something of which I have long experience. In 2005, when I was the founder of the feminist blog Carnival of Feminists, I saw the kinds of messages that the noble Baronesses have referred to, which in the days before social media used to be posted as comments on people’s blogs. You can still find the blog out there—it ran from 2005 to 2009—but many of its links to other blogs will be dead because they were often run by young women, often young women of colour, who were driven to pull down their blogs and sometimes were driven off the internet entirely by threatening, fearsome messages of the type that the noble Baroness, Lady Kennedy, referred to. We can argue about the drafting here—I will not have any opinion on that in detail—but something that addresses that issue is really important.
Secondly, we have not yet heard the Government’s introduction to Amendment 268AZA, but the noble Lord, Lord Clement-Jones, provided us with the information that it is an amendment to create the offence of encouraging or assisting self-harm. I express support for the general tenor of that, but I want to make one specific point: so far as I can see, the amendment does not have any defence or carve-out for harm-reduction messages, which may be necessary.
To set the context here, figures from the Royal College of Psychiatrists say that about one in 10 young people self-harm at some stage in their youth, and the RCP says those figures are probably an underestimate because they are based on figures where medical professionals actually see them so the number is probably significantly higher than that. An article in the Journal of Psychiatric and Mental Health Nursing from 2018 entitled “Self-cutting and harm reduction” is focused on in-patient settings, but the arguments in it are important in setting the general tone. It says that
“harm reduction in all its guises starts from the premise that the end goal”—
that is, to end self-harm entirely—
“is neither necessarily nor inevitably abstinence”,
which cannot be the solution for some people. Rather,
“the extinction of some particular form of behaviour may not be realistic for, or even desired by, the individual”.
So you may find messages that say, “If you are going to cut yourself, use a clean blade. If you do cut yourself, look after the wound afterwards”, but there is a risk that those kinds of well-intentioned, well-meaning and indeed expert messages could be caught by the amendment. I googled self-harm and harm reduction, and the websites that came up included Self Injury Support, which provides expert advice; a number of mental health trusts and healthcare trusts; and, indeed, the royal college’s own website.
The noble Lord, Lord Allan of Hallam, was trying to address this issue with Amendment 268AZC, which would allow the DPP to authorise prosecutions, but it seems to me that a better approach would be to have in the government amendment a statement saying, “We acknowledge that there will be cases where people talk about self-harm in ways that seek to minimise harm rather than simply stopping it, and they are not meant to be caught by this amendment”.
My Lords, as the noble Baroness, Lady Bennett, said, it seems a very long time since we heard the introduction from the noble Lord, Lord Clement-Jones, but it was useful in setting this helpful and well-informed debate on its way. I am sure the whole Committee is keen to hear the Minister introducing the government amendments, even at this very late stage in the debate.
I would like to make reference to a few points. I was completely captivated by the noble Lord, Lord Moylan, who invoked the Ten Commandments. I say to him that one can go to no higher order, which I am sure will support the amendments that he and his colleagues have put forward.
I will refer first to the amendments tabled by my noble friend Lady Kennedy. At a minimum, they are interesting because they try to broaden the scope of the current offences. I believe they also try to anticipate the extent of the impact of the government amendments, which in my view would be improved by my noble friend’s amendments. As my noble friend said, so many of the threats that are experienced online by, and directed towards, women and girls are indirect. They are about encouraging others: saying “Somebody should do something terrible to you” is extremely common. I feel that here is an opportunity to address that in the Bill, and if we do not, we will have missed a major aspect. I hope that the Minister will take account of that and be positive. We can all be relaxed about whether the amendments need to be made, but the intent is there.
That part of the debate made a strong case to build on the debate we had on an earlier day in Committee about violence against women and girls, which was led by the noble Baroness, Lady Morgan, and supported by noble Baronesses and noble Lords from all sides of the House. We called upon the Minister then to ensure that the Bill explicitly includes the necessary amendments to make it refer to violence against women and girls because, for all the reasons that my noble friend Lady Kennedy has explained, the abuse directed at them is considerably greater than that directed at others. Without wishing to dismiss the fact that everybody receives levels of abuse, we have to be realistic here: I believe that my noble friend’s amendments are extremely helpful there.
This is a bit in anticipation of what the Minister will say—I am sure he will forgive me if he already has the answers. The noble Lords, Lord Clement-Jones and Lord Allan, referred particularly to the coalition of some 130 individuals and organisations which have expressed their concerns. I want to highlight those concerns as well, because they speak to some important points. The groups in that coalition include the largest self-harm charity, Self Injury Support, along with numerous smaller self-harm support organisations and, of course, the mental health charity Mind. Their voice is therefore considerable.
To emphasise what has already been outlined, the concern with the current amendments is that they are somewhat broad and equivalent to an offence of glamorising self-harm, which was rejected by the Law Commission in its consultation on the offence. That followed concern from the Magistrates’ Association and the Association of Police and Crime Commissioners that the offence would be ambiguous in application and complex to prosecute. It also risks criminalising people in distress, something that none of us want to see.
In addition, the broadness of the offence risks criminalising peer support and harm reduction resources, by defining them as capable of “encouraging or assisting” when they are in fact intended to help people who self-harm. This was raised by the noble Baroness, Lady Finlay, today and in respect of her Private Member’s Bill, which we debated very recently in this Chamber, and I am sure that it would not be the Minister’s intention.
I would like to emphasise another point that has been made. The offence may also criminalise content posted by people who are in distress and sharing their own experiences of self-harm—the noble Baroness, Lady Finlay, referred to this—by, for example, posting pictures of wounds. We do not want to subject vulnerable people to prosecution, so let us not have an offence which ends up harming the very people it aims to protect. I shall be listening closely to the Minister.
There are a number of mitigations which would help, such as the introduction of defences excluding peer support and harm reduction resources, as well as content which has been posted with a view to reducing one’s own serious self-harm. In addition, there could be a mitigation, which I hope we will see, requiring consent from the DPP for any prosecution to occur. That was also suggested by the Law Commission and was picked up by the noble Lord, Lord Allan. There is also the potential for mitigation by including a requirement of malicious intent. This would ensure that offences apply only to instances of trolling or bullying.
I will leave the Minister with a few questions. It would be helpful to hear what consultation there has been with self-harm specific organisations and how the government amendments differ from the broader “glamorisation” offence, which was rejected by the Law Commission. It would also be helpful to hear examples of the content that is intended to be criminalised by the offence. That would be of interest to your Lordships’ Committee and to the coalition of key organisations and individuals who are keen, as we all are, to see this Bill end up in the right form and place. I look forward to hearing from the Minister.
My Lords, this has been a broad and mixed group of amendments. I will be moving the amendments in my name, which are part of it. These introduce the new offence of encouraging or assisting serious self-harm and make technical changes to the communications offences. If there can be a statement covering the group and the debate we have had, which I agree has been well informed and useful, it is that this Bill will modernise criminal law for communications online and offline. The new offences will criminalise the most damaging communications while protecting freedom of expression.
Amendments 264A, 266 and 267, tabled by the noble Lord, Lord Clement-Jones, and my noble friend Lady Buscombe, would expand the scope of the false communications offence to add identity theft and financial harm to third parties. I am very grateful to them for raising these issues, and in particular to my noble friend Lady Buscombe for raising the importance of financial harm from fake reviews. This will be addressed through the Digital Markets, Competition and Consumers Bill, which was recently introduced to Parliament. That Bill proposes new powers to address fake and misleading reviews. This will provide greater legal clarity to businesses and consumers. Where fake reviews are posted, it will allow the regulator to take action quickly. The noble Baroness is right to point out the specific scenarios about which she has concern. I hope she will look at that Bill and return to this issue in that context if she feels it does not address her points to her satisfaction.
Identity theft is dealt with by the Fraud Act 2006, which captures those using false identities for their own benefit. It also covers people selling or using stolen personal information, such as banking information and national insurance numbers. Adding identity theft to the communications offences here would duplicate existing law and expand the scope of the offences too broadly. Identity theft, as the noble Lord, Lord Clement-Jones, noted, is better covered by targeted offences rather than communications offences designed to protect victims from psychological and physical harm. The Fraud Act is more targeted and therefore more appropriate for tackling these issues. If we were to add identity theft to Clause 160, we would risk creating confusion for the courts when interpreting the law in these areas—so I hope the noble Lord will be inclined to side with clarity and simplicity.
Amendment 265, tabled by my noble friend Lord Moylan, gives me a second chance to consider his concerns about Clause 160. The Government believe that the clause is necessary and that the threshold of harm strikes the right balance, robustly protecting victims of false communications while maintaining people’s freedom of expression. Removing “psychological” harm from Clause 160 would make the offence too narrow and risk excluding communications that can have a lasting and serious effect on people’s mental well-being.
But psychological harm is only one aspect of Clause 160; all elements of the offence must be met. This includes a person sending a knowingly false message with an intention to cause non-trivial harm, and without reasonable excuse. That threshold of harm was also tested extensively as part of the Law Commission’s work on its report Modernising Communications Offences. The offence thus sets a high bar for prosecution, whereby a person cannot be prosecuted solely on the basis of a message causing psychological harm.
The noble Lord, Lord Allan, rightly recalled Section 127 of the Communications Act and the importance of probing issues such as this. I am glad he mentioned the Twitter joke trial—a good friend of mine acted as junior counsel in that case, so I remember it well. I shall spare the blushes of the noble Baroness, Lady Merron, in recalling who the Director of Public Prosecutions was at the time. But it is important that we look at these issues, and I am happy to speak further with my noble friend Lord Moylan and the noble Baroness, Lady Fox, about this and their broader concerns about freedom of expression between now and Report, if they would welcome that.
My noble friend Lord Moylan said that it would be unusual, or novel, to criminalise lying. The offence of fraud by false representation already makes it an offence dishonestly to make a false representation—to breach the ninth commandment—with the intention of making a gain or causing someone else a loss. So, as my noble and learned friend Lord Garnier pointed out, there is a precedent for lies with malicious and harmful intent being criminalised.
Amendments 267AA, 267AB and 268, tabled by my noble friend Lady Buscombe and the noble Baroness, Lady Kennedy of The Shaws, take the opposite approach to those I have just discussed, as they significantly lower and expand the threshold of harm in the false and threatening communications offences. The first of these would specify that a threatening communications offence is committed even if someone encountering the message did not fear that the sender specifically would carry out the threat. I am grateful to the noble Baroness for her correspondence on this issue, informed by her work in Scotland. The test here is not whether a message makes a direct threat but whether it conveys a threat—which can certainly cover indirect or implied threats.
I reassure the noble Baroness and other noble Lords that Clause 162 already captures threats of “death or serious harm”, including rape and disfigurement, as well as messages that convey a threat of serious harm, including rape and death threats, or threats of serious injury amounting to grievous bodily harm. If a sender has the relevant intention or recklessness, the message will meet the required threshold. But I was grateful to see my right honourable friend Edward Argar watching our debates earlier, in his capacity as Justice Minister. I mentioned the matter to him and will ensure that his officials have the opportunity to speak to officials in Scotland to look at the work being done with regard to Scots law, and to follow the points that the noble Baroness, Lady Bennett, made about pictures—
I am grateful to the Minister. I was not imagining that the formulations that I played with fulfilled all of the requirements. Of course, as a practising lawyer, I am anxious that we do not diminish standards. I thank the noble Baroness, Lady Fox, for raising concerns about freedom of speech, but this is not about telling people that they are unattractive or ugly, which is hurtful enough to many women and can have very deleterious effects on their self-confidence and willingness to be public figures. Actually, I put the bar reasonably high in describing the acts that I was talking about: threats that somebody would kill, rape, bugger or disfigure you, or do whatever to you. That was the shocking thing: the evidence showed that it was often at that high level. It is happening not just to well-known public figures, who can become somewhat inured to this because they can find a way to deal with it; it is happening to schoolgirls and young women in universities, who get these pile-ons as well. We should reckon with the fact that it is happening on a much wider basis than many people understand.
Yes, we will ensure that, in looking at this in the context of Scots law, we have the opportunity to see what is being done there and that we are satisfied that all the scenarios are covered. In relation to the noble Baroness’s Amendment 268, the intentional encouragement or assistance of a criminal offence is already captured under Sections 44 to 46 of the Serious Crime Act 2007, so I hope that that satisfies her that that element is covered—but we will certainly look at all of this.
I turn to government Amendment 268AZA, which introduces the new serious self-harm offence, and Amendments 268AZB and 268AZC, tabled by the noble Lords, Lord Allan and Lord Clement-Jones. The Government recognise that there is a gap in the law in relation to the encouragement of non-fatal self-harm. The new offence will apply to anyone carrying out an act which intends to, and is capable of, encouraging or assisting another person seriously to self-harm by means of verbal or electronic communications, publications or correspondence.
I say to the noble Baroness, Lady Finlay of Llandaff, that the new clause inserted by Amendment 268AZA makes clear that, when a person sends or publishes a communication encouraging or assisting serious self-harm, that is an offence, and that, when a person forwards on another person’s communication, that will be an offence too. The new offence will capture only the most serious behaviour and avoid criminalising vulnerable people who share their experiences of self-harm. The preparation of these clauses was informed by extensive consultation with interested groups and campaign bodies. The new offence includes two key elements that constrain the offence to the most culpable offending; namely, that a person’s act must be intended to encourage or assist the serious self-harm of another person and that serious self-harm should amount to grievous bodily harm. If a person does not intend to encourage or assist serious self-harm, as will likely be the case with recovery and supportive material, no offence will be committed. The Law Commission looked at this issue carefully, following evidence from the Samaritans and others, and the implementation will be informed by ongoing consultation as well.
I am sorry to interrupt the Minister, but the Law Commission recommended that the DPP’s consent should be required. The case that the Minister has made on previous occasions in some of the consultations that he has had with us is that this offence that the Government have proposed is different from the Law Commission one, and that is why they have not included the DPP’s consent. I am rather baffled by that, because the Law Commission was talking about a high threshold in the first place, and the Minister is talking about a high threshold of intent. Even if he cannot do so now, it would be extremely helpful to tie that down. As the noble Baroness and my noble friend said, 130 organisations are really concerned about the impact of this.
The Law Commission recommended that the consent, but not the personal consent, of the Director of Public Prosecutions should be required. We believe, however, that, because the offence already has tight parameters due to the requirement for an intention to cause serious self-harm amounting to grievous bodily harm, as I have just outlined, an additional safeguard of obtaining the personal consent of the Director of Public Prosecutions is not necessary. We would expect the usual prosecutorial discretion and guidance to provide sufficient safeguards against inappropriate prosecutions in this area. As I say, we will continue to engage with those groups that have helped to inform the drafting of these clauses as they are implemented to make sure that that assessment is indeed borne out.
Amendment 268AZB aims to apply business disruption enforcement measures to any internet service that “persistently fails to prevent”, or indeed allows, the illegal encouragement of self-harm. As I mentioned earlier in Committee, the Bill significantly reduces the likelihood of users encountering this material on internet sites. It requires all user-to-user services to remove this content and search services to minimise users’ access to it. I hope that that reassures the noble Lords in relation to their amendments to my amendment.
I completely accept that, yes, by requiring the regulated services to prevent access to this kind of content, we will make a significant difference, but it is still the case that there will be—we know there will be, because they exist today—these individual websites, blogs or whatever you want to call them which are not regulated user-to-user services and which are promoting self-harm content. It would be really helpful to know what the Government think should happen to a service such as that, given that it is outside the regulation; it may be persistently breaking the law but be outside our jurisdiction.
I will follow up in writing on that point.
Before I conclude, I will mention briefly the further government amendments in my name, which make technical and consequential amendments to ensure that the communications offences, including the self-harm offence, have the appropriate territorial extent. They also set out the respective penalties for the communications offences in Northern Ireland, alongside a minor adjustment to the epilepsy trolling offence, to ensure that its description is more accurate.
I hope that noble Lords will agree that the new criminal laws that we will make through this Bill are a marked improvement on the status quo. I hope that they will continue to support the government amendments. I express my gratitude to the Law Commission and to all noble Lords—
Just before the Minister sits down—I assume that he has finished his brief on the self-harm amendments; I have been waiting—I have two questions relating to what he said. First, if I heard him right, he said that the person forwarding on is also committing an offence. Does that also apply to those who set up algorithms that disseminate, as opposed to one individual forwarding on to another individual? Those are two very different scenarios. We can see how one individual forwarding to another could be quite targeted and malicious, and we can see how disseminating through an algorithm could have very widespread harms across a lot of people in a lot of different groups—all types of groups—but I am not clear from what he said that that has been caught in his wording.
Secondly—I will ask both questions while I can—I asked the Minister previously why there have been no prosecutions under the Suicide Act. I understood from officials that this amendment creating an offence was to reflect the Suicide Act and that suicide was not included in the Bill because it was already covered as an offence by the Suicide Act. Yet there have been no prosecutions and we have had deaths, so I do not quite understand why I have not had an answer to that.
I will have to write on the second point to try to set that out in further detail. On the question of algorithms, the brief answer is no, algorithms would not be covered in the way a person forwarding on a communication is covered unless the algorithm has been developed with the intention of causing serious self-harm; it is the intention that is part of the test. If somebody creates an algorithm intending people to self-harm, that could be captured, but if it is an algorithm generally passing it on without that specific intention, it may not be. I am happy to write to the noble Baroness further on this, because it is a good question but quite a technical one.
It needs to be addressed, because the very small websites already alluded to are providing some extremely nasty material. They are not providing support to people or helping to decrease the harm to those who self-harm but seem to be enjoying the spectacle of it. We need to differentiate and make sure that we do not inadvertently let one group get away with disseminating very harmful material simply because it has a small website somewhere else. I hope that will be included in the Minister’s letter; I do not expect him to reply now.
Some of us are slightly disappointed that my noble friend did not respond to my point on the interaction of Clause 160 with the illegal content duty. Essentially, what appears to be creating a criminal offence could simply be a channel for hyperactive censorship on the part of the platforms to prevent the criminal offence taking place. He has not explained that interaction. He may say that there is no interaction and that we would not expect the platforms to take any action against offences under Clause 160, or that we expect a large amount of action, but nothing was said.
If my noble friend will forgive me, I had better refresh my memory of what he said—it was some time ago—and follow up in writing.
My Lords, I will be extremely brief. There is much to chew on in the Minister’s speech and this was a very useful debate. Some of us will be happier than others; the noble Baroness, Lady Buscombe, will no doubt look forward to the digital markets Bill and I will just have to keep pressing the Minister on the Data Protection and Digital Information Bill.
There is a fundamental misunderstanding about digital identity theft. It will not necessarily always be fraud that is demonstrated—the very theft of the identity is designed to be the crime, and it is not covered by the Fraud Act 2006. I am delighted that the Minister has agreed to talk further with the noble Baroness, Lady Kennedy, because that is a really important area. I am not sure that my noble friend will be that happy with the response, but he will no doubt follow up with the Minister on his amendments.
The Minister made a very clear statement on the substantive aspect of the group, the new crime of encouraging self-harm, but further clarification is still needed. We will look very carefully at what he said in relation to what the Law Commission recommended, because it is really important that we get this right. I know that the Minister will talk further with the noble Baroness, Lady Finlay, who is very well versed in this area. In the meantime, I beg leave to withdraw my amendment.
Amendment 264A withdrawn.
My Lords, I now have to go through a mass of amendments that are not to be the subject of debate today as they have been debated previously. I will proceed as swiftly as I can.
Amendments 265 to 267 not moved.
Amendment 267A
Moved by
267A: Clause 160, page 138, line 25, leave out from “liable” to end of line 27 and insert “—
(a) on summary conviction in England and Wales, to imprisonment for a term not exceeding the maximum term for summary offences or a fine (or both);
(b) on summary conviction in Northern Ireland, to imprisonment for a term not exceeding 6 months or a fine not exceeding level 5 on the standard scale (or both).”
Member’s explanatory statement
This amendment sets out the penalties for the false communications offence in Northern Ireland, since the offence is now to extend to Northern Ireland as well as England and Wales.
Amendment 267A agreed.
Clause 160, as amended, agreed.
Clause 161 agreed.
Clause 162: Threatening communications offence
Amendments 267AA and 267AB not moved.
Amendments 267B and 267C
Moved by
267B: Clause 162, page 139, line 38, after “conviction” insert “in England and Wales”
Member’s explanatory statement
This amendment adds a reference to England and Wales to differentiate the provision from the similar provision applying to Northern Ireland (see the next amendment in the Minister’s name).
267C: Clause 162, page 139, line 39, at end insert—
“(aa) on summary conviction in Northern Ireland, to imprisonment for a term not exceeding 6 months or a fine not exceeding the statutory maximum (or both);”
Member’s explanatory statement
This amendment sets out the penalties for the threatening communications offence in Northern Ireland, since the offence is now to extend to Northern Ireland as well as England and Wales.
Amendments 267B and 267C agreed.
Clause 162, as amended, agreed.
Clause 163: Interpretation of sections 160 to 162
Amendment 268 not moved.
Clause 163 agreed.
Clause 164: Offences of sending or showing flashing images electronically
Amendment 268A
Moved by
268A: Clause 164, page 142, line 30, leave out subsection (14)
Member’s explanatory statement
This is a technical amendment about extent - the extent of the epilepsy trolling offence in clause 164 is now dealt with by amendments of clause 210 (see the amendments of that clause in the Minister’s name).
Amendment 268A agreed.
Clause 164, as amended, agreed.
Amendment 268AZA
Moved by
268AZA: After Clause 164, insert the following new Clause—
“Offence of encouraging or assisting serious self-harm
(1) A person (D) commits an offence if—
(a) D does a relevant act capable of encouraging or assisting the serious self-harm of another person, and
(b) D’s act was intended to encourage or assist the serious self-harm of another person.
(2) D “does a relevant act” if D—
(a) communicates in person,
(b) sends, transmits or publishes a communication by electronic means,
(c) shows a person such a communication,
(d) publishes material by any means other than electronic means,
(e) sends, gives, shows or makes available to a person—
(i) material published as mentioned in paragraph (d), or
(ii) any form of correspondence, or
(f) sends, gives or makes available to a person an item on which data is stored electronically.
(3) “Serious self-harm” means self-harm amounting to—
(a) in England and Wales and Northern Ireland, grievous bodily harm within the meaning of the Offences Against the Person Act 1861, and
(b) in Scotland, severe injury,
and includes successive acts of self-harm which cumulatively reach that threshold.
(4) The person referred to in subsection (1)(a) and (b) need not be a specific person (or class of persons) known to, or identified by, D.
(5) D may commit an offence under this section whether or not serious self-harm occurs.
(6) If a person (D1) arranges for a person (D2) to do an act that is capable of encouraging or assisting the serious self-harm of another person and D2 does that act, D1 is to be treated as also having done it.
(7) In the application of subsection (1) to an act by D involving an electronic communication or a publication in physical form, it does not matter whether the content of the communication or publication is created by D (so for example, in the online context, the offence under this section may be committed by forwarding another person’s direct message or sharing another person’s post).
(8) In the application of subsection (1) to the sending, transmission or publication by electronic means of a communication consisting of or including a hyperlink to other content, the reference in subsection (2)(b) to the communication is to be read as including a reference to content accessed directly via the hyperlink.
(9) In the application of subsection (1) to an act by D involving an item on which data is stored electronically, the reference in subsection (2)(f) to the item is to be read as including a reference to content accessed by means of the item to which the person in receipt of the item is specifically directed by D.
(10) A provider of an internet service by means of which a communication is sent, transmitted or published is not to be regarded as a person who sends, transmits or publishes it.
(11) Any reference in this section to doing an act that is capable of encouraging the serious self-harm of another person includes a reference to doing so by threatening another person or otherwise putting pressure on another person to seriously self-harm. “Seriously self-harm” is to be interpreted consistently with subsection (3).
(12) Any reference to an act in this section, except in subsection (3), includes a reference to a course of conduct, and references to doing an act are to be read accordingly.
(13) In subsection (3) “act” includes omission.
(14) A person who commits an offence under this section is liable—
(a) on summary conviction in England and Wales, to imprisonment for a term not exceeding the general limit in a magistrates’ court or a fine (or both);
(b) on summary conviction in Scotland, to imprisonment for a term not exceeding 12 months or a fine not exceeding the statutory maximum (or both);
(c) on summary conviction in Northern Ireland, to imprisonment for a term not exceeding 6 months or a fine not exceeding the statutory maximum (or both);
(d) on conviction on indictment, to imprisonment for a term not exceeding 5 years or a fine (or both).”
Member’s explanatory statement
This amendment inserts a new offence of encouraging or assisting another person to seriously self-harm, with intent to do so, by means of verbal or electronic communications, publications or correspondence.
Amendment 268AZA agreed.
Amendments 268AZB and 268AZC not moved.
Amendment 268AA not moved.
Clause 165: Extra-territorial application and jurisdiction
Amendments 268B to 268F
Moved by
268B: Clause 165, page 142, line 32, leave out subsections (1) and (2)
Member’s explanatory statement
This amendment omits provisions which relate to offences that extended to England and Wales only, as the offences in question are now to extend to Northern Ireland as well.
268C: Clause 165, page 142, line 38, leave out “Section 164(1) applies” and insert “Sections 160(1), 162(1) and 164(1) apply”
Member’s explanatory statement
This amendment, regarding extra-territorial application, is needed because of the extension of the offences in clauses 160 and 162 to Northern Ireland.
268CA: Clause 165, page 142, line 44, at end insert—
“(4A) Section (Offence of encouraging or assisting serious self-harm)(1) applies to an act done outside the United Kingdom, but only if the act is done by a person within subsection (4B).
(4B) A person is within this subsection if the person is—
(a) an individual who is habitually resident in the United Kingdom, or
(b) a body incorporated or constituted under the law of any part of the United Kingdom.”
Member’s explanatory statement
This amendment provides for the extra-territorial application of the new offence proposed by the amendment in the Minister’s name to be inserted after clause 164.
268D: Clause 165, page 143, line 1, leave out subsection (5)
Member’s explanatory statement
This amendment omits a provision which relates to offences that extended to England and Wales only, as the offences in question are now to extend to Northern Ireland as well.
268E: Clause 165, page 143, line 4, after “section” insert “160, 162 or”
Member’s explanatory statement
This amendment, regarding extra-territorial jurisdiction, is needed because of the extension of the offences in clauses 160 and 162 to Northern Ireland.
268EA: Clause 165, page 143, line 7, at end insert—
“(6A) Proceedings for an offence committed under section (Offence of encouraging or assisting serious self-harm) outside the United Kingdom may be taken, and the offence may for incidental purposes be treated as having been committed, at any place in the United Kingdom.
(6B) In the application of subsection (6A) to Scotland, any such proceedings against a person may be taken, and the offence may for incidental purposes be treated as having been committed—
(a) in any sheriff court district in which the person is apprehended or is in custody, or
(b) in such sheriff court district as the Lord Advocate may determine.
(6C) In subsection (6B) “sheriff court district” is to be construed in accordance with the Criminal Procedure (Scotland) Act 1995 (see section 307(1) of that Act).”
Member’s explanatory statement
This amendment is required in order to give courts in the United Kingdom jurisdiction to deal with the new offence proposed by the amendment in the Minister’s name to be inserted after clause 164, if the offence is committed outside the United Kingdom.
268F: Clause 165, page 143, line 8, leave out subsection (7)
Member’s explanatory statement
This is a technical amendment about extent - the extent of clause 165 is now dealt with by amendments of clause 210 (see the amendments of that clause in the Minister’s name).
Amendments 268B to 268F agreed.
Clause 165, as amended, agreed.
Clause 166: Liability of corporate officers
Amendments 268FA to 268G
Moved by
268FA: Clause 166, page 143, line 10, leave out “or 164” and insert “, 164 or (Offence of encouraging or assisting serious self-harm)”
Member’s explanatory statement
This amendment ensures that clause 166, which is about the liability of corporate officers for offences, applies in relation to the new offence proposed by the amendment in the Minister’s name to be inserted after clause 164.
268FB: Clause 166, page 143, line 22, at end insert—
“(2A) If an offence under section (Offence of encouraging or assisting serious self-harm) is committed by a Scottish partnership and it is proved that the offence—
(a) has been committed with the consent or connivance of a partner of the partnership, or
(b) is attributable to any neglect on the part of a partner of the partnership,
the partner (as well as the partnership) commits the offence and is liable to be proceeded against and punished accordingly.
(2B) “Partner”, in relation to a Scottish partnership, includes any person who was purporting to act as a partner.”
Member’s explanatory statement
This amendment ensures that clause 166, which is about the liability of corporate officers for offences, applies to Scottish partnerships.
268G: Clause 166, page 143, line 23, leave out subsection (3)
Member’s explanatory statement
This is a technical amendment about extent - the extent of clause 166 is now dealt with by amendments of clause 210 (see the amendments of that clause in the Minister’s name).
Amendments 268FA to 268G agreed.
Clause 166, as amended, agreed.
Clause 167: Sending etc photograph or film of genitals
Amendments 269 and 270 not moved.
Clause 167 agreed.
Amendment 271 not moved.
Clause 168: Repeals in connection with offences under sections 160 and 162
Amendments 271A and 271B
Moved by
271A: Clause 168, page 144, line 17, after “Wales” insert “and Northern Ireland”
Member’s explanatory statement
This amendment ensures that section 127(2)(a) and (b) of the Communications Act 2003 is repealed for Northern Ireland as well as England and Wales (because the false communications offence in clause 160 is now to extend to Northern Ireland as well).
271B: Clause 168, page 144, line 22, at end insert—
“(3) The following provisions of the Malicious Communications (Northern Ireland) Order 1988 (S.I. 1988/1849 (N.I. 18)) are repealed—
(a) Article 3(1)(a)(ii),
(b) Article 3(1)(a)(iii), and
(c) Article 3(2).”
Member’s explanatory statement
This amendment amends the specified Northern Ireland legislation in consequence of the extension of the false and threatening communications offences to Northern Ireland.
Amendments 271A and 271B agreed.
Clause 168, as amended, agreed.
Clause 169: Consequential amendments
Amendment 271BA
Moved by
271BA: Clause 169, page 144, line 25, at end insert—
“(1A) Part 1A of Schedule 14 contains amendments consequential on section (Offence of encouraging or assisting serious self-harm).”
Member’s explanatory statement
This amendment introduces a Part of Schedule 14 containing consequential amendments related to the new offence proposed by the amendment in the Minister’s name to be inserted after clause 164.
Amendment 271BA agreed.
Clause 169, as amended, agreed.
Schedule 14: Amendments consequential on offences in Part 10 of this Act
Amendments 271C to 271F
Moved by
271C: Schedule 14, page 231, line 33, leave out from “2003” to “after” in line 34 and insert “, in the list of offences for England and Wales,”
Member’s explanatory statement
This amendment makes it clearer that changes to the Sexual Offences Act 2003 in paragraph 2 of Schedule 14 to the Bill relate to England and Wales only (since the next amendment in the Minister’s name makes equivalent amendments for Northern Ireland).
271D: Schedule 14, page 231, line 38, at end insert—
“2A_ In Schedule 5 to the Sexual Offences Act 2003, in the list of offences for Northern Ireland, after paragraph 171H insert—
“171I_ An offence under section 160 of the Online Safety Act 2023 (false communications).
171J_ An offence under section 162 of that Act (threatening communications).””
Member’s explanatory statement
This amendment concerns offences relevant to the making of certain orders under the Sexual Offences Act 2003. Now that the false and threatening communications offences under this Bill are to extend to Northern Ireland, this amendment updates the references in Schedule 5 to the Sexual Offences Act that relate to Northern Ireland.
271E: Schedule 14, page 232, line 14, after “sending” insert “or showing”
Member’s explanatory statement
This amendment makes a minor change to the description of the epilepsy trolling offence so that the description is more accurate.
271F: Schedule 14, page 232, line 14, at end insert—
“Part 1A
AMENDMENTS CONSEQUENTIAL ON OFFENCE IN SECTION (ENCOURAGING OR ASSISTING SERIOUS SELF-HARM)
Children and Young Persons Act 1933
4A_ In Schedule 1 to the Children and Young Persons Act 1933 (offences against children and young persons with respect to which special provisions of Act apply), after the entry relating to the Suicide Act 1961 insert—
“An offence under section (Offence of encouraging or assisting serious self-harm)(1) of the Online Safety Act 2023 (encouraging or assisting serious self-harm) where the relevant act is an act capable of, and done with the intention of, encouraging or assisting the serious self-harm of a child or young person.”
Visiting Forces Act 1952
4B_(1) The Schedule to the Visiting Forces Act 1952 (offences referred to in section 3) is amended as follows.
(2) In paragraph 1(b), after paragraph (xv) insert—
“(xvi) section (Offence of encouraging or assisting serious self-harm) of the Online Safety Act 2023;”.
(3) In paragraph 2(b), after paragraph (iv) insert—
“(v) section (Offence of encouraging or assisting serious self-harm) of the Online Safety Act 2023;”.
Children and Young Persons Act (Northern Ireland) 1968 (c. 34 (N.I.))
4C_ In Schedule 1 to the Children and Young Persons Act (Northern Ireland) 1968 (offences against children and young persons with respect to which special provisions of Act apply), after the entry relating to the Criminal Justice Act (Northern Ireland) 1966 insert—
“An offence under section (Offence of encouraging or assisting serious self-harm)(1) of the Online Safety Act 2023 (encouraging or assisting serious self-harm) where the relevant act is an act capable of, and done with the intention of, encouraging or assisting the serious self-harm of a child or young person.”
Criminal Attempts Act 1981
4D_ In section 1 of the Criminal Attempts Act 1981 (attempting to commit an offence), in subsection (4), after paragraph (c) insert—
“(d) an offence under section (Offence of encouraging or assisting serious self-harm)(1) of the Online Safety Act 2023 (encouraging or assisting serious self-harm).”
Criminal Attempts and Conspiracy (Northern Ireland) Order 1983 (S.I. 1983/1120 (N.I. 13))
4E_ In Article 3 of the Criminal Attempts and Conspiracy (Northern Ireland) Order 1983 (attempting to commit an offence), in paragraph (4), after sub-paragraph (c) insert—
“(ca) an offence under section (Offence of encouraging or assisting serious self-harm)(1) of the Online Safety Act 2023 (encouraging or assisting serious self-harm);”
Armed Forces Act 2006
4F_ In Schedule 2 to the Armed Forces Act 2006 (“Schedule 2 offences”), in paragraph 12, at the end insert—
“(ba) an offence under section (Offence of encouraging or assisting serious self-harm) of the Online Safety Act 2023 (encouraging or assisting serious self-harm).”
Serious Crime Act 2007
4G_(1) The Serious Crime Act 2007 is amended as follows.
(2) In section 51A (exceptions to section 44 for encouraging or assisting suicide)—
(a) the existing text becomes subsection (1);
(b) after that subsection insert—
“(2) Section 44 does not apply to an offence under section (Offence of encouraging or assisting serious self-harm)(1) of the Online Safety Act 2023 (offence of encouraging or assisting serious self-harm).”;
(c) in the heading, at the end insert “or serious self-harm”.
(3) In Part 1 of Schedule 3 (listed offences: England and Wales and Northern Ireland), after paragraph 24A insert—
“Online Safety Act 2023
24B_ An offence under section (Offence of encouraging or assisting serious self-harm)(1) of the Online Safety Act 2023 (encouraging or assisting serious self-harm).””
Member’s explanatory statement
This amendment makes changes which are consequential on the new offence proposed by the amendment in the Minister’s name to be inserted after clause 164. Among other things, changes are proposed to the Criminal Attempts Act 1981 and the Serious Crime Act 2007 to ensure that offences of attempt and encouragement etc in those Acts do not apply in relation to the new offence, because that offence is itself an inchoate offence.
Amendments 271C to 271F agreed.
Schedule 14, as amended, agreed.
Clause 170: Providers’ judgements about the status of content
Amendments 272 to 283ZA not moved.
Clause 170 agreed.
Clause 171: OFCOM’s guidance about illegal content judgements
Amendment 283A
Moved by
283A: Clause 171, page 145, line 43, at end insert “, and
(b) judgements by providers about whether news publisher content amounts to a relevant offence (see section 14(5) and (10)).”
Member’s explanatory statement
This amendment, in effect, re-states the provision currently in clause 14(11), requiring OFCOM’s guidance under clause 171 to cover the judgements described in the amendment.
Amendment 283A agreed.
Amendment 284 not moved.
Clause 171, as amended, agreed.
Clauses 172 to 174 agreed.
Schedule 15 agreed.
Clauses 175 and 176 agreed.
Amendment 284A
Moved by
284A: After Clause 176, insert the following new Clause—
“Offence of failure to comply with confirmation decision: supplementary
(1) Where a penalty has been imposed on a person by a penalty notice under section 126 in respect of a failure constituting an offence under section (Confirmation decisions: offence) (failure to comply with certain requirements of a confirmation decision), no proceedings may be brought against the person for that offence.
(2) A penalty may not be imposed on a person by a penalty notice under section 126 in respect of a failure constituting an offence under section (Confirmation decisions: offence) if—
(a) proceedings for the offence have been brought against the person but have not been concluded, or
(b) the person has been convicted of the offence.
(3) Where a service restriction order under section 131 or an access restriction order under section 133 has been made in relation to a regulated service provided by a person in respect of a failure constituting an offence under section (Confirmation decisions: offence), no proceedings may be brought against the person for that offence.”
Member’s explanatory statement
This amendment ensures, among other things, that a person cannot be prosecuted for the new offence created by the new clause to be inserted after clause 125 in the Minister’s name if OFCOM have imposed a financial penalty for the same conduct instead, and vice versa.
Amendment 284A agreed.
Clauses 177 to 179 agreed.
Clause 180: Extra-territorial application
Amendments 284B and 284C
Moved by
284B: Clause 180, page 150, line 23, leave out “Section 121(7)” and insert “Sections 121(7) and 137(11)”
Member’s explanatory statement
This amendment adds a reference to clause 137(11) so that that provision (which is about enforcement by civil proceedings) has extra-territorial application.
284C: Clause 180, page 150, line 24, leave out “applies” and insert “apply”
Member’s explanatory statement
This amendment is consequential on the preceding amendment in the Minister’s name.
Amendments 284B and 284C agreed.
Clause 180, as amended, agreed.
Clause 181: Information offences: extra-territorial application and jurisdiction
Amendments 284D to 284F
Moved by
284D: Clause 181, page 150, line 29, at end insert—
“(2A) Section (Confirmation decisions: offence) applies to acts done by a person in the United Kingdom or elsewhere (offence of failure to comply with confirmation decision).”
Member’s explanatory statement
This amendment gives wide extra-territorial effect to the new offence created by the new clause to be inserted after clause 125 in the Minister’s name (failure to comply with certain requirements of a confirmation decision).
284E: Clause 181, page 150, line 31, after “subsection (1)” insert “or (2A)”
Member’s explanatory statement
This amendment extends the extra-territorial effect of the new offence of failure to comply with certain requirements of a confirmation decision in the case of senior managers etc who may commit the offence under clause 178(2) or 179(5).
284F: Clause 181, page 150, line 34, leave out “or 101” and insert “, 101 or (Confirmation decisions: offence)”
Member’s explanatory statement
This amendment is required in order to give United Kingdom courts jurisdiction to deal with the new offence of failure to comply with certain requirements of a confirmation decision if it is committed elsewhere.
Amendments 284D to 284F agreed.
Clause 181, as amended, agreed.
Clauses 182 to 184 agreed.
Amendments 285 and 286 not moved.
Land is in sight! I call Amendment 286ZA.
Amendment 286ZA
Moved by
286ZA: After Clause 184, insert the following new Clause—
“Artificial intelligence: labelling of machine-generated content
Within the period of six months beginning with the day on which this Act is passed, the Secretary of State must publish draft legislation with provisions requiring providers of regulated services to put in place systems and processes for—
(a) identifying content on their service which is machine-generated, and
(b) informing users of the service that such content is machine-generated.”
Member’s explanatory statement
This probing amendment is to facilitate a discussion around the potential labelling of machine-generated content, which is a measure being considered in other jurisdictions.
My Lords, that was a bravura performance by the noble Lord, Lord Lexden. We thank him. To those listening in the Public Gallery, I should say that we debated most of those; it was not quite as on the nod as it looked.
Amendment 286ZA, in the name of my noble friend Lord Stevenson, seeks to address a critical issue in our digital landscape: the labelling of AI-generated content on social media platforms.
As we navigate the ever-evolving world of technology, it is crucial that we uphold transparency, safeguarding the principles of honesty and accountability. Social media has become an integral part of our lives, shaping public discourse, disseminating information and influencing public opinion. However, the rise of AI-powered algorithms and tools has given rise to a new challenge: an increasing amount of content generated by artificial intelligence without explicit disclosure.
We live in an age where AI is capable of creating incredibly realistic text, images and even videos that can be virtually indistinguishable from those generated by humans. While this advancement holds immense potential, it also raises concerns regarding authenticity, trust and the ethical implications of AI-generated content. The proposed amendment seeks to address this concern by advocating for a simple but powerful solution—labelling AI-generated content as such. By clearly distinguishing human-generated content from AI-generated content, we empower individuals to make informed decisions about the information they consume, promoting transparency and reducing the potential for misinformation or manipulation.
Labelling AI-generated content serves several crucial purposes. First and foremost, it allows individuals to differentiate between information created by humans and that generated by algorithms. In an era where misinformation and deepfakes pose a significant threat to public trust, such labelling becomes a vital tool to protect and promote digital literacy.
Secondly, it enables users to better understand the potential biases and limitations of AI-generated content. AI algorithms are trained on vast datasets, and without labelling, individuals might unknowingly attribute undue credibility to AI-generated information, assuming it to be wholly objective and reliable. Labelling, however, helps users to recognise the context and provides an opportunity for critical evaluation.
Furthermore, labelling AI-generated content encourages responsible behaviour from the platforms themselves. It incentivises social media companies to develop and implement AI technologies with integrity and transparency, ensuring that users are aware of the presence and influence of AI in their online experiences.
Some may argue that labelling AI-generated content is an unnecessary burden or that it could stifle innovation. However, the intention behind this amendment is not to impede progress but to foster a healthier digital ecosystem built on trust, integrity and informed decision-making. By promoting transparency, we can strike a balance that allows innovation to flourish while safeguarding the interests of individuals and society as a whole.
In conclusion, the amendment to label AI-generated content on social media platforms represents a crucial step forward in addressing the challenges of the digital age. By embracing transparency and empowering individuals, we can foster a more informed and discerning society. Let us lead by example and advocate for a digital landscape that values accountability, integrity and the rights of individuals. I urge your Lordships to support this amendment as we strive to build a future where technology works hand-in-hand with humanity for the betterment of all.
In the spirit of the amendment, I must flag that my entire speaking note was generated by AI, as the noble Lord, Lord Allan, from his expression, had clearly guessed. In using this tool, I do so not to belittle the amendment but to illustrate that these tools are already infiltrating everyday life and can supercharge misinformation. We need to do something to help internet users trust what they read.
Does the noble Lord agree that the fact that we did not notice his speech was generated by AI somewhat damages his argument?
The fact that I labelled it as being AI-generated helped your Lordships to understand, and the transparency eases the debate. I beg to move.
My Lords, I thank the noble Lord, Lord Knight, for laying out the amendment and recognise that there was a very thoughtful debate on the subject of machine-generated content on Amendment 125 in my name on a previous day of Committee.
I appreciate that the concept of labelling or watermarking machine-generated material is central to recent EU legislation, but I am equally aware that there is more than one school of thought on the efficacy of that approach among AI experts. On the one hand, as the noble Lord, Lord Knight, beautifully set out—with the help of his artificial friend—there are those who believe that visibly marking the division of real and altered material is a clue for the public to look more carefully at what they are seeing and that labelling it might provide an opportunity for both creators and digital companies to give greater weight to “human-created material”. For example, it could be that the new BBC Verify brand is given greater validity by the public, or that Google’s search results promote it above material labelled as machine-generated as a more authentic source. There are others who feel that the scale of machine-generated material will be so vast that this labelling will be impossible or that labelling will downgrade the value of very important machine-generated material in the public imagination, when in the very near future it is likely that most human activity will be a blend of generated material and human interaction.
I spent the first part of this week locked in a room with others at the Institute for Ethics in AI in Oxford debating some of these issues. While this is a very live discussion, one thing is clear: if we are to learn from history, we must act now before all is certain, and we should act with pragmatism and a level of humility. It may be that either or both sets of experts are correct.
Industry has clearly indicated that there is an AI arms race, and many companies are launching services that they do not understand the implications of. This is not my view but one told to me by a company leader, who said that the speed of distribution was so great that the testing was confined to whether deploying large language models crashed the platforms; there was no testing for safety.
The noble Lord, Lord Stevenson, says in his explanatory statement that this is a probing amendment. I therefore ask the Minister whether we might meet before Report and look once again at the gaps that might be covered by some combination of Amendment 125 and the amendment in front of us, to make certain that the Bill adequately reflects the concerns raised by the enforcement community and reflects the advice of those who best understand the latest iterations of the digital world.
The Communications Act 2003 made a horrible mistake in not incorporating digital within it; let us not do the same here. Adding explicit safety duties to AI and machine learning would not slow down innovation but would ensure that innovation is not short-sighted and dangerous for humanity. It is a small amendment for what may turn out to be an unimaginably important purpose.
My Lords, it is a pleasure to follow the noble Baroness, Lady Kidron. I will try to keep my remarks brief.
It is extremely helpful that we have the opportunity to talk about this labelling question. I see it more as a kind of aperitif for our later discussion of AI regulation writ large. Given that it is literally aperitif hour, I shall just offer a small snifter as to why I think there may be some challenges around labelling—again, perhaps that is not a surprise to the noble Baroness.
When we make rules, as a general matter we tend to assume that people are going to read them and respond in a rationalist, conformist way. In reality, particularly in the internet space, we often see that there is a mixed environment and there will be three groups. There are the people who will look at the rules and respond in that rational way to them; a large group of people will just ignore them—they will simply be unaware and not at all focused on the rules; and another group will look for opportunities to subvert them and use them to their own advantage. I want to comment particularly on that last group by reference to cutlery and call centres, two historic examples of where rules have been subverted.
On the cutlery example, I am a Sheffielder, and “Made in Sheffield” used to mean that you had made the entire knife in Sheffield. Then we had this long period when we went from knives being made in Sheffield to bringing them to Sheffield and silver-plating them, to eventually just sharpening them and putting them in boxes. That is relevant in the context of AI. Increasingly, if there is an advantage to be gained by appearing to be human, people will look at what kind of finishing you need, so: “The content may have been generated by AI but the button to post it was pushed by a human, therefore we do not think it is AI because we looked at it and posted it”. On the speech of the noble Lord, Lord Knight, does the fact that my noble friend intervened on him and the noble Lord had to use some of his own words now mean that his speech in Hansard would not have to be labelled “AI-generated” because we have now departed from it? Therefore, there is that question of individuals who will want something to appear human-made even if it was largely AI-generated, and whether they will find the “Made in Sheffield” way of bypassing it.
Interestingly, we may see the phenomenon flipping the other way, and this is where my call centres come in. If people go to a popular search engine and type in “SpinVox”, they will see the story of a tech company that promised to transcribe voicemails into written text. This was a wonderful use of technology, and it was valued on the basis that it had developed that fantastic technology. However, it turned out—or at least there were claims, which I can repeat here under privilege—that it was using call centres in low-cost, low-wage environments to type those messages out. Therefore, again, we may see, curiously, some people seeing an advantage to presenting content as AI-generated when it is actually made by humans. That is just to flag that up—as I say, it is a much bigger debate that we are going to have. It is really important that we are having it, and labelling has a role to play. However, as we think about it, I urge that we remember those communities of people who will look at whatever rules we come up with and say, “Aha! Where can I get advantage?”, either by claiming that something is human when it is generated by AI or claiming that it is generated by AI if it suits them when it was actually produced by humans.
My Lords, it is a pleasure to follow the noble Lord, Lord Allan. He reminded me of significant reports of the huge amount of exploitation in the digital sector that has come from identification of photos. A great deal of that is human labour, even though it is often claimed to have been done through machine intelligence.
In speaking to this late but important amendment, I thank the noble Lords, Lord Stevenson and Lord Knight, for giving us the chance to do so, because, as every speaker has said, this is really important. I should declare my position as a former newspaper editor. I distinctly recall teasing a sports journalist in the early 1990s when it was reported that journalists were going to be replaced by computer technology. I said that the sports journalists would be the first to go because they just wrote to a formula anyway. I apologise to sports journalists everywhere.
The serious point behind that is that a lot of extreme, high claims are now being made about so-called artificial intelligence. I declare myself an artificial-intelligence sceptic. What we have now—so-called generative AI—is essentially big data. To quote the science fiction writer, Ted Chiang, what we have is applied statistics. Generative AI relies on looking at what already exists, and it cannot produce anything original. In many respects, it is a giant plagiarism machine. There are huge issues, beyond the scope of the Bill, around intellectual property and the fact that it is not generating anything original.
None the less, it is generating what people in the sector like to describe as hallucinations, which might otherwise be described as errors, falsehoods or lies. This is where quotes are made up; ideas are presented which, at first glance, look as though they make sense but fall apart under examination; and data is actively invented. There is one rather famous case where a lawyer got himself into a great deal of trouble by producing a whole lot of entirely false cases that a bot generated for him. We need to be really careful, and this amendment shows us a way forward in attempting to deal with some of the issues we are facing.
To pick up the points made by the noble Lord, Lord Allan, about the real-world impacts, I was at an event in Parliament this week entitled “The Worker Experience of the AI Revolution”, run by the TUC and Connected by Data. It highlighted what has happened with a lot of the big data exercises already in operation: rather than humans being replaced by robots, people are being forced to act like robots. We heard from Royal Mail and Amazon workers, who are monitored closely and expected to act like machines. That is just one example of the unexpected outcomes of the technologies we have been exercising in recent years.
I will make two final comments. First, I refer to 19th-century Luddite John Booth, who was tortured to death by the state. He was a Luddite, but he was also on the record as saying that new machinery
“might be man’s chief blessing instead of his curse if society were differently constituted”.
History is not pre-written; it is made by the choices, laws and decisions we make in this Parliament. Given where we are at the moment with so-called AI, I urge that caution really is warranted. We should think about putting some caution in the Bill, which is what this amendment points us towards.
My final point relates to an amendment I was not allowed to table because, I was told, it was out of scope. It asked the Secretary of State to report on the climate emissions coming from the digital sector, specifically from artificial intelligence. The noble Baroness, Lady Kidron, said that it will operate on a vast scale. I point out that, already, the digital sector is responsible for 3% of the world’s electricity use and 2% of the world’s carbon emissions, which is about the same as the airline sector. We really need to think about caution. I very much agree with everyone who said that we need to have more discussions on all these issues before Report.
My Lords, this is a real hit-and-run operation from the noble Lord, Lord Stevenson. He has put down an amendment on my favourite subject in the last knockings of the Bill. It is totally impossible to deal with this now—I have been thinking and talking about the whole area of AI governance and ethics for the past seven years—so I am not going to try. It is important, and the advisory committee under Clause 139 should take it into account. Actually, this is much more a question of authenticity and verification than of content. Trying to work out whether something is ChatGPT or GPT-4 content is a hopeless task; you are much more likely to be able to identify whether these are automated users such as chatbots than you are to know about the content itself.
I will leave it there. I missed the future-proofing debate, which I would have loved to be part of. I look forward to further debates with the noble Viscount, Lord Camrose, on the deficiencies in the White Paper and to the Prime Minister’s much more muscular approach to AI regulation in future.
I am sure that the noble Lord, Lord Stevenson of Balmacara, is smiling over a sherry somewhere about the debate he has facilitated. His is a useful probing amendment and we have had a useful discussion.
The Government certainly recognise the potential challenges posed by artificial intelligence and digitally manipulated content such as deepfakes. As we have heard in previous debates, the Bill ensures that machine-generated content on user-to-user services created by automated tools or machine bots will be regulated where appropriate. Clause 49(4)(b) means that machine-generated content is regulated unless the bot or automated tool producing the content is controlled by the provider of the service.
The labelling of this content via draft legislation is not something to which I can commit today. The Government’s AI regulation White Paper sets out the principles for the responsible development of artificial intelligence in the UK. These principles, such as safety, transparency and accountability, are at the heart of our approach to ensuring the responsible development and use of AI. As set out in the White Paper, we are building an agile approach that is designed to be adaptable in response to emerging developments. We do not wish to introduce a rigid, inflexible form of legislation for what is a flexible and fast-moving technology.
The public consultation on these proposals closed yesterday so I cannot pre-empt our response to it. The Government’s response will provide an update. I am joined on the Front Bench by the Minister for Artificial Intelligence and Intellectual Property, who is happy to meet with the noble Baroness, Lady Kidron, and others before the next stage of the Bill if they wish.
Beyond labelling such content, I can say a bit to make it clear how the Bill will address the risks coming from machine-generated content. The Bill already deals with many of the most serious and illegal forms of manipulated media, including deepfakes, when they fall within scope of services’ safety duties regarding illegal content or content that is potentially harmful to children. Ofcom will recommend measures in its code of practice to tackle such content, which could include labelling where appropriate. In addition, the intimate image abuse amendments that the Government will bring forward will make it a criminal offence to send deepfake images.
In addition to ensuring that companies take action to keep users safe online, we are taking steps to empower users with the skills they need to make safer choices through our work on media literacy. Ofcom, for example, has an ambitious programme of work through which it is funding several initiatives to build people’s resilience to harm online, including initiatives designed to equip people with the skills to identify disinformation. We are keen to continue our discussions with noble Lords on media literacy and will keep an open mind on how it might be a tool for raising awareness of the threats of disinformation and inauthentic content.
With gratitude to the noble Lords, Lord Stevenson and Lord Knight, and everyone else, I hope that the noble Lord, Lord Knight, will be content to withdraw his noble friend’s amendment.
My Lords, I am grateful to everyone for that interesting and quick debate. It is occasionally one’s lot that somebody else tables an amendment but is unavoidably detained in Jerez, drinking sherry, and monitoring things in Hansard while I move the amendment. I am perhaps more persuaded than my noble friend might have been by the arguments that have been made.
We will return to this in other fora in response to the need to regulate AI. However, in the meantime, I enjoyed in particular the John Booth quote from the noble Baroness, Lady Bennett. In respect of this Bill and any of the potential harms around generative AI, if we have a Minister who is mindful of the need for safety by design when we have concluded this Bill, then we will have dealt with the bits that we needed to deal with as far as this Bill is concerned.
Can the noble Lord confirm whether he generated those comments himself, or was he on his phone while we were speaking?
I do not have an invisible earpiece feeding me my lines—that was all human-generated. I beg leave to withdraw the amendment.
Amendment 286ZA withdrawn.
Clause 185 agreed.
Schedule 16 agreed.
Clauses 186 and 187 agreed.
Schedule 17: Video-sharing platform services: transitional provision etc
Amendment 286A
Moved by
286A: Schedule 17, page 239, line 36, after “19(2)” insert “and (8A)”
Member’s explanatory statement
This amendment ensures that, during the transitional period when video-sharing platform services continue to be regulated by Part 4B of the Communications Act 2003, providers of such services are not exempt from the new duty in clause 19 to supply records of risk assessments to OFCOM.
Amendment 286A agreed.
Schedule 17, as amended, agreed.
Clause 188: Repeals: Digital Economy Act 2017
Amendment 286B
Moved by
286B: Clause 188, page 154, line 1, after “119(10)” insert “and (11)”
Member’s explanatory statement
This amendment effects the repeal of a provision of the Digital Economy Act 2017 which solely relates to another provision of that Act being repealed.
Amendment 286B agreed.
Clause 188, as amended, agreed.
Clauses 189 to 196 agreed.
Clause 197: Parliamentary procedure for regulations
Amendments 287 to 289 not moved.
Clause 197 agreed.
Amendment 290 not moved.
Clauses 198 to 201 agreed.
Clause 202: “Proactive technology”
Amendments 290A to 290G
Moved by
290A: Clause 202, page 166, line 3, leave out “moderation” and insert “identification”
Member’s explanatory statement
This amendment re-names “content moderation technology” as “content identification technology” as that term is more accurate.
290B: Clause 202, page 166, line 7, leave out “moderation” and insert “identification”
Member’s explanatory statement
This amendment is consequential on the first amendment of clause 202 in the Minister’s name.
290C: Clause 202, page 166, line 9, leave out from “analyses” to end of line 11 and insert “content to assess whether it is content of a particular kind (for example, illegal content).”
Member’s explanatory statement
This amendment revises the definition of content identification technology so that the restrictions in the Bill on OFCOM recommending or requiring the use of proactive technology apply to content identification technology operating on any kind of content.
290D: Clause 202, page 166, line 12, leave out “moderation” and insert “identification”
Member’s explanatory statement
This amendment is consequential on the first amendment of clause 202 in the Minister’s name.
290E: Clause 202, page 167, line 4, leave out “moderation” and insert “identification”
Member’s explanatory statement
This amendment is consequential on the first amendment of clause 202 in the Minister’s name.
290F: Clause 202, page 167, line 9, leave out “moderation” and insert “identification”
Member’s explanatory statement
This amendment is consequential on the first amendment of clause 202 in the Minister’s name.
290G: Clause 202, page 167, leave out lines 15 to 18
Member’s explanatory statement
This amendment is consequential on the first amendment of clause 202 in the Minister’s name.
Amendments 290A to 290G agreed.
Clause 202, as amended, agreed.
Clause 203: Content communicated “publicly” or “privately”
Amendment 290H
Moved by
290H: Clause 203, page 167, line 38, at end insert “, or
(ii) users of another internet service.”
Member’s explanatory statement
This amendment concerns the factors that OFCOM must particularly consider when deciding if content is communicated publicly or privately. The change ensures that one such factor is how easily the content may be shared with users of another service.
Amendment 290H agreed.
Clause 203, as amended, agreed.
Clause 204: “Functionality”
Amendments 291 to 293 not moved.
Clause 204 agreed.
Clause 205: “Harm” etc
Amendments 294 and 295 not moved.
Clause 205 agreed.
Clause 206 agreed.
Amendment 296 not moved.
Clause 207: Interpretation: general
Amendment 297
Moved by
297: Clause 207, page 170, line 13, leave out from “means” to end of line 14 and insert “any system of checking age or age range (including age estimation and age verification);
“age estimation” includes reference to an age range or an age expressed in years;
“age verification” means the exact age of a person in years, months, and days or an established date of birth;”
Member’s explanatory statement
This amendment defines the meaning of age assurance in the Bill to recognise it includes any test of age including but not limited to verification. Age verification means the exact age of a person in years, months, and days or a date of birth. Age estimation may refer to an age range or an age expressed in years. This is a definition of terms only: the intention is that Ofcom will produce guidance of what level of assurance is required in different settings.
My Lords, we already had a long debate on this subject earlier in Committee. In the interim, many noble Lords associated with these amendments have had conversations with the Government, which I hope will bear some fruit before Report. Today, I want to reiterate a few points that I hope are clarifying to the Committee and the department. In the interests of everyone’s evening plans, the noble Lord, Lord Bethell, and the noble Baroness, Lady Harding, wish to associate themselves with these remarks so that they represent us in our entirety.
For many years, we thought age verification was a gold standard, primarily because it involved a specific government-issued piece of information such as a passport. By the same token, we thought age estimation was a lesser beast, given that it is an estimate by its very nature and that the sector primarily relied on self-declarations with very few checks and balances. In recent years, many approaches to age checking have flourished. Some companies provide age assurance tokens based on facial recognition; others use multiple signals of behaviour, friendship group, parental controls and how you move your body in gameplay; and, only yesterday, I saw the very impressive on-device privacy-preserving age-verification system that Apple rolled out in the US two weeks ago. All of these approaches, used individually and cumulatively, have a place in the age-checking ecosystem, and all will become more seamless over time. But we must ensure that, when they are used, they are adequate for the task they are performing and are quality controlled so that they do not share information about a child, are secure and are effective.
That is why, at the heart of the package of measures put forward in my name and those of the noble Lords, Lord Stevenson and Lord Bethell, and the right reverend Prelate the Bishop of Oxford, are two concepts. First, the method of measuring age should be tech neutral so that all roads can be used. Secondly, there must be a robust mechanism for measuring effectiveness so that only effective systems can be used in high-risk situations, particularly those involving primary priority harms such as self-harm and pornography, and so that such measurement is determined by Ofcom, not industry.
From my work over the last decade and from recent discussions with industry, I am certain that any regime of age assurance must be measurable and hold to certain principles. We cannot create a situation where children’s data is loosely held and liberally shared; we cannot have a system that discriminates against, or does not have automatic appeal mechanisms for, children of colour or those who are 17 or 19, for whom the likelihood of error is greatest. Systems should aim to be interoperable and private, and should not leave traces as children go from one service to another.
Each of the principles of our age-verification package set out in the schedule is of crucial importance. I hope that the Government will see the sense in them because, without them, age checking will not be trusted. Equally, I urge the Committee to embrace the duality of age verification and estimation that the Government have put forward because, if a child uses an older sibling’s form of verification and a company understands through the child’s behaviour that they are indeed a child, we do not want to set up a perverse situation in which the verification is considered of a higher order and the company cannot take action based on estimation; ditto, if estimation in gameplay is more accurate than tokens that verify whether someone is over or under 18, it may well be that estimation gives greater assurance that the company will treat the child according to their age.
I hope and believe that, in his response, the Minister will confirm that definitions of age assurance and age estimation will be on the face of the Bill. I also urge him to make a generous promise to accept the full gamut of our concerns about age checking and bring forward amendments in his name on Report that reflect them in full. I beg to move.
My Lords, I associate these Benches with the introduction by the noble Baroness, Lady Kidron, support her amendments and, likewise, hope that they form part of the package that is trundling on its way towards us.
My Lords, what more can I say than that I wish to be associated with the comments made by the noble Baroness and then by the noble Lord, Lord Clement-Jones? I look forward to the Minister’s reply.
I am very grateful to the noble Baroness for her amendment, which is a useful opportunity for us to state publicly and share with the Committee the progress we have been making in our helpful discussions on these issues in relation to these amendments. I am very grateful to her and to my noble friends Lord Bethell and Lady Harding for speaking as one on this, including, as is well illustrated, in this short debate this evening.
As the noble Baroness knows, discussions continue on the precise wording of these definitions. I share her optimism that we will be able to reach agreement on a suitable way forward, and I look forward to working with her, my noble friends and others as we do so.
The Bill already includes a definition of age assurance in Clause 207, which is
“measures designed to estimate or verify the age or age-range of users of a service”.
As we look at these issues, we want to avoid using words such as “checking”, which suggests that providers need to take a proactive approach to checking age, as that may inadvertently preclude the use of technologies which determine age through other means, such as profiling. It is also important that any definition of age assurance does not restrict the current and future use of innovative and accurate technologies. I agree that it is important that there should be robust definitions for terms which are not currently defined in the Bill, such as age verification, and recommit to the discussions we continue to have on what terms need to be defined and the best way to define them.
This has been a very helpful short debate with which to end our deliberations in Committee. I am very grateful to noble Lords for all the points that have been raised over the past 10 days, and I am very glad to be ending in this collaborative spirit. There is much for us still to do, and even more for the Office of the Parliamentary Counsel to do, before we return on Report, and I am grateful to it and to the officials working on the Bill. I urge the noble Baroness to withdraw her amendment.
I beg leave to withdraw the amendment.
Amendment 297 withdrawn.
Amendments 298 to 304 not moved.
Clause 207 agreed.
Clauses 208 and 209 agreed.
Clause 210: Extent
Amendments 304A and 304B
Moved by
304A: Clause 210, page 175, line 24, leave out “Except as provided by subsections (2) to (7)” and insert “Subject to the following provisions of this section”
Member’s explanatory statement
This amendment avoids any implication that the power proposed to be inserted by the amendment of the extent clause in the Minister’s name giving power to extend provisions of the Bill to the Crown Dependencies, and related provisions, are limited in extent to the United Kingdom.
304B: Clause 210, page 175, line 26, leave out subsection (2)
Member’s explanatory statement
This amendment omits a provision in the extent clause which is now dealt with by text inserted by the next three amendments in the Minister’s name.
Amendments 304A and 304B agreed.
Amendment 304C had been withdrawn from the Marshalled List.
Amendment 304CA
Moved by
304CA: Clause 210, page 175, line 29, leave out subsection (3) and insert—
“(3) The following provisions extend to England and Wales and Northern Ireland—
(a) sections 160 to 164;
(b) section 168(1).”
Member’s explanatory statement
This amendment revises the extent clause as a result of changes to the extent of the communications offences in Part 10 of the Bill.
Amendment 304CA agreed.
Amendment 304D had been withdrawn from the Marshalled List.
Amendments 304E to 304K
Moved by
304E: Clause 210, page 175, line 35, leave out subsection (6) and insert—
“(6) The following provisions extend to Northern Ireland only—
(a) section 168(3);
(b) section 190(7) to (9).”
Member’s explanatory statement
This amendment revises the extent clause so that the amendments of Northern Ireland legislation in clause 168 extend to Northern Ireland only.
304F: Clause 210, page 176, line 2, at end insert—
“(7A) His Majesty may by Order in Council provide for any of the provisions of this Act to extend, with or without modifications, to the Bailiwick of Guernsey or to the Isle of Man.
(7B) Subsections (1) and (2) of section 196 apply to an Order in Council under subsection (7A) as they apply to regulations under this Act.”
Member’s explanatory statement
This amendment provides a power for His Majesty by Order in Council to extend any of the provisions of the Bill to Guernsey or the Isle of Man.
304G: Clause 210, page 176, line 4, leave out from second “to” to end of line 5 and insert “the Bailiwick of Guernsey or the Isle of Man any amendment or repeal made by or under this Act of any part of that Act (with or without modifications).”
Member’s explanatory statement
This amendment has the effect that the power conferred by section 411(6) of the Communications Act 2003 may be exercised so as to extend to Guernsey or the Isle of Man the amendment or repeal of provisions of that Act made by the Bill.
304H: Clause 210, page 176, line 7, leave out “any of the Channel Islands” and insert “the Bailiwick of Guernsey”
Member’s explanatory statement
This amendment has the effect that the power conferred by section 338 of the Criminal Justice Act 2003 may be exercised so as to extend to Guernsey (but not Jersey) the amendment of provisions of that Act made by paragraph 7 of Schedule 14 to the Bill.
304J: Clause 210, page 176, line 10, leave out “any of the Channel Islands” and insert “the Bailiwick of Guernsey”
Member’s explanatory statement
This amendment has the effect that the power conferred by section 60(6) of the Modern Slavery Act 2015 may be exercised so as to extend to Guernsey (but not Jersey) the amendment of Schedule 4 to that Act made by paragraph 9 of Schedule 14 to the Bill.
304K: Clause 210, page 176, line 13, leave out “any of the Channel Islands” and insert “the Bailiwick of Guernsey”
Member’s explanatory statement
This amendment has the effect that the power conferred by section 415(1) of the Sentencing Act 2020 may be exercised so as to extend to Guernsey (but not Jersey) the amendment of Schedule 18 to that Act made by paragraph 10 of Schedule 14 to the Bill.
Amendments 304E to 304K agreed.
Clause 210, as amended, agreed.
Clause 211: Commencement and transitional provision
Amendments 305 and 306 not moved.
Clause 211 agreed.
Clause 212 agreed.
House resumed.
Bill reported with amendments.