Committee (9th Day) (Continued)
Clause 68: Transparency reports about certain Part 3 services
Amendment 160A
Moved by
Lord Knight of Weymouth
160A: Clause 68, page 62, line 23, leave out paragraph (d) and insert—
“(d) be made publicly available, subject to appropriate redactions, on the date specified in the notice.”
Member’s explanatory statement
This amendment would make clear that transparency reports provided under Clause 68 must be made publicly available, subject to appropriate redactions.
My Lords, as we have said many times, this is a complex Bill. As we reflect on the priorities for Report, we can be more relaxed about some of the specifics on how Ofcom may operate, thereby giving it more flexibility—the flexibility it needs to be agile in the online world—if we as a Parliament trust Ofcom. Building trust, I believe, is a triangulation. First, there is independence from government—as discussed in respect of Secretary of State powers. Secondly, we need proper scrutiny by Parliament. Earlier today I talked about my desire for there to be proper post-legislative scrutiny and a permanent Joint Committee to do that. The third leg of the stool is the transparency to assist that scrutiny.
Clause 68 contains the provisions which would require category 1, 2A and 2B services to produce an annual transparency report containing information described by Ofcom in a notice given to the service. Under these provisions, Ofcom would be able to require these services to report on, among other things: information about the incidence of illegal content and content that is harmful to children; how many users are assumed to have encountered this content by means of the service; the steps and processes for users to report this content; and the steps and processes which a provider uses for dealing with this content.
We welcome the introduction of transparency reporting in relation to illegal content and content that is harmful to children. We agree with the Government that effective transparency reporting plays a crucial role in building Ofcom’s understanding of online harms and empowering users to make a more informed choice about the services they use.
However, despite the inclusion of transparency reporting in the Bill representing a step in the right direction, we consider that these requirements could and should be strengthened to do the trust building we think is important. First, the Bill should make clear that, subject to appropriate redactions, companies will be required to make their transparency reports publicly available—to make them transparent—hence Amendment 160A.
Although it is not clear from the Bill whether companies will be required to make these reports publicly available, we consider that, in most instances, such a requirement would be appropriate. As noted, one of the stated purposes of transparency reporting is that it would enable service users to make more informed choices about their own and their children’s internet use—but they can only do so if the reports are published. Moreover, in so far as transparency reporting would facilitate public accountability, it could also act as a powerful incentive for service providers to do more to protect their users.
We also recognise that requiring companies to publish the incidences of CSEA content on their platforms, for instance, may have the effect of encouraging individuals seeking such material towards platforms on which there are high incidences of that content—that must be avoided. I recognise that simply having a high incidence of CSEA content on a platform does not necessarily mean that that platform is problematic; it could just mean that it is better at reporting it. So, as ever with the Bill, there is a balance to be struck.
Therefore, we consider that the Bill should make it explicit that, once provided to Ofcom, transparency reports are to be made publicly available, subject to redactions. To support this, Ofcom should be required to produce guidance on the publication of transparency reports and the redactions that companies should make before making reports publicly accessible. Ofcom should also retain the power to stop a company from publishing a particular transparency report if it considers that the risk of directing individuals to illegal materials outweighs the benefit of making a report public—hence Amendments 160B and 181A.
Amendments 165 and 229 are in my noble friend Lord Stevenson’s name. Amendment 165 would broaden the transparency requirements around user-to-user services’ terms of service, ensuring that information can be sought on the scope of these terms, not just their application. As I understand it, scope is important to understand, as it is significant in informing Ofcom’s regulatory approach. We are trying to guard against minimal terms of service where detail is needed for users and Ofcom.
The proposed clause in Amendment 229 probes how Ofcom will review the effectiveness of the transparency requirements in the Bill. It would require Ofcom to undertake a review of the effectiveness of transparency reports within three years and every five years thereafter, and it would give the Secretary of State powers to implement any recommendations made by the regulator. The Committee should note that we also include a requirement that a Select Committee, charged by the relevant House, must consider and report on the regulations, with an opportunity for Parliament to debate them. So we link the three corners of the triangle rather neatly there.
If we agree that transparency is an important part of building trust in Ofcom in doing this difficult and innovative regulatory job—it is always good to see the noble Lord, Lord Grade, in his place; I know he is looking forward to getting on with this—then this proposed clause is sensible. I beg to move.
My Lords, I am pleased that the noble Lord, Lord Knight of Weymouth, has given us an opportunity to talk about transparency reports with these amendments, which are potentially a helpful addition to the Bill. Transparency is one of the huge benefits that the legislation may bring. One of the concerns that the public have and that politicians have always had with online platforms is that they appear to be a black box—you cannot see what is going on in them.
In the entire edifice that we are constructing in the Online Safety Bill, there are huge opportunities to change that. The platforms will have to do risk assessments —there are measures in the Bill to make sure that information about these is put out—and they will have to take active steps to mitigate any risks they find. Again, we may get directions and guidance from Ofcom that will explain to the public exactly what is expected of them. The final piece of the jigsaw is the transparency reports that show the outcomes—how a platform has performed and what it has done to meet its obligations in dealing with content and behaviour on its services.
For the record, I previously worked for one of the platforms, and I would have said that I was on the pro-transparency wing of the transparency party inside the company. I believed that it was in the platform’s interest: if you do not tell people what you are doing, they will make things up about you, and what they make up will generally be worse than what you are actually doing. So there are huge advantages to the platforms from being transparent.
The noble Lord, Lord Knight, has picked up on some important points in his Amendment 160B, which talks about making sure that the transparency report is not counterproductive by giving the bad guys information that they could use to ill effect. That is a valid point; it is often debated inside the platforms. Sometimes, I argued furiously with my colleagues in the platforms about why we should disclose information. They would ask, “What about the bad guys?” Sometimes I challenged that, but other times it would have been a genuine and accurate concern. The noble Lord mentioned things such as child sexual abuse material, and we have to recognise that the bad guys are incredibly devious and creative, and if you show them anything that they can use against you to get around your systems, they will try to do that. That is a genuine and valid concern.
The sort of thing that you might put into a transparency report is, for example, whether you have banned particular organisations. I would be in favour of indicating to the public that an organisation is banned, but you can see that the potential impact of that is that all the people you are concerned about would create another organisation with a different name and then get back on to your platform. We need to be alive to those kinds of concerns.
It is also relevant to Amendment 165 and the terms of service that the more granular and detailed your terms of service are, the better they are for public information, but there are opportunities to get around them. Again, we would have that argument internally. I would say, “If we are prohibiting specific hate speech terms, tell people that, and then they won’t use them”. For me, that would be a success, as they are not using those hate speech terms anymore, but, of course, they may then find alternative hate speech terms that they can use instead. You are facing that battle all the time. That is a genuine concern that I hope we will be able to debate. I hope that Ofcom will be able to mitigate that risk by discussing with platforms what these transparency reports should look like. In a sense, we are doing a risk assessment of the transparency report process.
Amendment 229 on effectiveness is really interesting. My experience was that if you did not have a transparency report, you were under huge pressure to produce one and that, once you produced one, nobody was interested. At the risk of embarrassing anyone in the Committee, I would be curious to know how many noble Lords participating in this debate have read the transparency reports already produced by Meta Platforms, Google and others. If they have not read them, they should not be embarrassed, because my experience was that I would talk to regulators and politicians about something they had asked me to come in to talk about, such as hate speech or child sexual abuse material, and I learned to print off the transparency report. I would go in and say, “Well, you know what we are doing; it’s in our transparency report”. They would ask, “What transparency report?”, and I would have to show them. So, every time we published a transparency report, we would expect there to be public interest, but little use was made of it. That is not a reason not to do them—as I said, I am very much in favour of doing them—but, on their own, they may not be effective, and Amendment 229 touches on that.
I was trying to think of a collective noun for transparency reports and, seeing as they shed light, I think it may be a “chandelier”. Where we may get the real benefit is if Ofcom can produce a chandelier of transparency reports, taking all the information it gets from the different platforms, processing it and selecting the most relevant information—the reports are often too long for people to work their way through—so that it can enable comparisons. That is really good and it is quite good for the industry that people know that platform A did this, platform B did that, and platform C did something else. They will take note of that, compare with each other and want to get into the best category. It is also critical that Ofcom puts this into user-friendly language, and Ofcom has quite a good record of producing intelligible reports. In the context of Amendment 229, a review process is good. One of the things that might come out of that, thinking ahead, would be Ofcom’s role in producing meta transparency reports, the chandelier that will shed light on what the whole sector is doing.
My Lords, for once I want to be really positive. I am actually very positive about this whole group of amendments because more transparency is essential in what we are discussing. I especially like Amendment 165 from the noble Lord, Lord Stevenson of Balmacara, because it is around terms of service for user-to-user services and ensures that information can be sought on the scope as well as the application. This is important because so much has been put on user-to-user services as well as on terms of service. You need to know what is going on.
I want particularly to compliment Amendment 229 that says that transparency reports should be
“of sufficient quality to enable service users and researchers to make informed judgements”,
et cetera. That is a very elegant way in which to say that they should not be gobbledegook. If we are going to have them, they should be clear and of a quality that we can read. Obviously, we do not want them to be unreadable and full of jargon and legalistic language. I am hoping that that is the requirement.
I am positive because I have been worried that so much depends on terms of service and how that can lead to the overremoval of content and, as we discussed the other day, there is no individual complaints mechanism through Ofcom. So I am grasping for ways in which users can have a right of redress. Understanding why something has been taken down is very important. As the noble Lord, Lord Allan of Hallam, said, so much is hidden from users. People will constantly say, “My material has been deboosted”, and it might well have been. They will say things such as, “The algorithms are hiding content, even if they are not removing it”. I have noticed that people can get very paranoid, and it can fuel conspiracy theories because you get people saying, “Nobody has retweeted my tweet. This is a deboosting algorithm”, when actually they have not retweeted it because it was boring. If you could see a bit more clearly what the policies were instead of feeling that they are hidden from your view, it would lessen the paranoia and that kind of accusation.
My first caveat to this proposal relates to what the noble Lord, Lord Allan, said about the bad guys being banned and, if they are banned, emerging somewhere else. We also need to recognise that sometimes people who are called the bad guys are banned when they are not in fact the bad guys. They need to be able to say, “We’re not the bad guys”. That is why the more detail, the better. My other caveat is that I do not want to be in a situation where we demand endless regulatory complexity, and where reports and impositions make life impossible for the services in terms of red tape and paperwork. Generally speaking, however, I am very positive about these amendments, and I hope that by Report they become, one way or another, part of the Bill.
My Lords, I strongly support the amendment in the names of the noble Lords, Lord Knight and Lord Stevenson, as well as my noble friend Lady Featherstone. The essence of the message from the noble Lord, Lord Knight, about the need for trust and the fact that you can gain trust through greater transparency is fundamental to this group.
The Joint Committee’s report is now a historical document. It is partly the passage of time, but it was an extraordinary way in which to work through some of the issues, as we did. We were very impacted by the evidence given by Frances Haugen, and the fact that certain things came to light only as a result of her sharing information with the Securities and Exchange Commission. We said at the time that:
“Lack of transparency of service providers also means that people do not have insight into the prevalence and nature of activity that creates a risk of harm on the services that they use”.
That is very much the sense that the noble Lord, Lord Stevenson, is trying to get to by adding scope as well.
We were very clear about our intentions at the time. The Government accepted the recommendation that we made and said that they agreed with the committee that
“services with transparency reporting requirements should be required to publish their transparency reports in full, and in an accessible and public place”.
So what we are really trying to do is to get the Government to agree to what they have already agreed to, which we would have thought would be a relatively straightforward process.
There are some other useful aspects, such as the review of effectiveness of the transparency requirements. I very much appreciate what my noble friend just said about not reading transparency reports. I read the oversight reports but not necessarily the transparency reports. I am not sure that Frances Haugen was a great advert for transparency reports at the time, but that is a mere aside in the circumstances.
I commend my noble friend Lady Featherstone’s Amendment 171, which is very consistent with what we were trying to achieve with the code of practice about violence against women and girls. That would fit very easily within that. One of the key points that my noble friend Lord Allan made is that this is for the benefit of the platforms as well. It is not purely for the users. Of course it is useful for the users, but not exclusively, and this could be a way of platforms engaging with the users more clearly, inserting more fresh air into this. In these circumstances it is pretty conclusive that the Government should adhere to what they agreed to in their response to the Joint Committee’s report.
As ever, I thank all noble Lords who have spoken. I absolutely take, accept and embrace the point that transparency is wholly critical to what we are trying to achieve with the Bill. Indeed, the chandelier of transparency reports should be our shared aim—a greenhouse maybe. I am grateful for everyone’s contributions to the debate. I agree entirely with the views expressed. Transparency is vital in holding companies to account for keeping their users safe online. As has been pointed out, it is also to the benefit of the platforms themselves. Confident as I am that we share the same objectives, I would like to try to reassure noble Lords on a number of issues that have been raised.
Amendments 160A, 160B and 181A in the name of the noble Lord, Lord Knight of Weymouth, seek to require providers to make their transparency reports publicly available, subject to appropriate redactions, and to allow Ofcom to prevent their publication where it deems that the risks posed by drawing attention to illegal content outweigh the benefit to the public of the transparency report. Let me reassure the noble Lord that the framework, we strongly believe, already achieves the aim of those amendments. As set out in Clause 68, Ofcom will specify a range of requirements in relation to transparency reporting in a notice to category 1, 2A and 2B services. This will include the kind of information that is required in the transparency report and the manner in which it should be published. Given the requirement to publish the information, this already achieves the intention of Amendment 160A.
The specific information requested for inclusion within the transparency report will be determined by Ofcom. Therefore, the regulator will be able to ensure that the information requested is appropriate for publication. Ofcom will take into account any risks arising from making the information public before issuing the transparency notice. Ofcom will have separate information-gathering powers, which will enable the regulator to access information that is not suitable to be published in the public domain. This achieves the intention of Amendment 160B. There is also a risk of reducing trust in transparency reporting if there is a mechanism for Ofcom to prevent providers publishing their transparency reports.
Amendment 181A would require Ofcom to issue guidance on what information should be redacted and how this should be done. However, Ofcom is already required to produce guidance about transparency reports, which may include guidance about what information should be redacted and how to do this. It is important to provide the regulator with the flexibility to develop appropriate guidance.
Amendment 165 seeks to expand the information within the transparency reporting requirements to cover the scope of the terms of service set out by user-to-user providers. I very much agree with the noble Lord that it is important that Ofcom can request information about the scope of terms of service, as well as about their application. Our view is that the Bill already achieves this. Schedule 8 sets out the high-level matters about which information may be required. This includes information about how platforms are complying with their duties. The Bill will place duties on user-to-user providers to ensure that any required terms of service are clear and accessible. This will require platforms to set out what the terms of service cover—or, in other words, the scope. While I hope that this provides reassurance on the matter, if there are still concerns in spite of what I have said, I am very happy to look at this. Any opportunity to strengthen the Bill through that kind of clarity is worth looking at.
I welcome the Minister’s comments. I am interrupting just because this is my amendment rather than my noble friend Lord Knight’s. The word “scope” caused us some disquiet on this Bench when we were trying to work out what we meant by it. It has been fleshed out in slightly different ways around the Chamber, to advantage.
I go back to the original intention—I am sorry for the extensive introduction, but it is to make sure that I focus the question correctly—which was to make sure that we are not looking historically at the terms of service that have been issued, and whether they are working in a transparency mode, but addressing the question of what is missing or is perhaps not addressed properly. Does the Minister agree that that would be taken in by the word “scope”?
I think I probably would agree, but I would welcome a chance to discuss it further.
Finally, Amendment 229 intends to probe how Ofcom will review the effectiveness of transparency requirements in the Bill. It would require Ofcom to produce reports reviewing the effectiveness of transparency reports and would give the Secretary of State powers to implement any recommendations made by the regulator. While I of course agree with the sentiment of this amendment, as I have outlined, the transparency reporting power is designed to ensure that Ofcom can continuously review the effectiveness of transparency reports and make adjustments as necessary. This is why the Bill requires Ofcom to set out in annual transparency notices what each provider should include in its reports and the format and manner in which it should be presented, rather than putting prescriptive or static requirements in the Bill. That means that Ofcom will be able to learn, year on year, what will be most effective.
Under Clause 145, Ofcom is required to produce its own annual transparency report, which must include a summary of conclusions drawn from providers’ transparency reports, along with the regulator’s view on industry best practice and other appropriate information—I hope and think that goes to some of the points raised by the noble Lord, Lord Allan of Hallam.
My Lords, just before the Minister moves on—and possibly to save me finding and reading it—can he let us know whether those annual reports by Ofcom will be laid before Parliament and whether Parliament will have a chance to debate them?
I believe so, but I will have to confirm that in writing. I am sorry not to be able to give a rapid answer.
Clause 159 requires the Secretary of State to review the operation of the regulatory framework as a whole to ensure that it is effective. In that review, Ofcom will be a statutory consultee. The review will specifically require an assessment of the effectiveness of the regulatory framework in ensuring that the systems and processes used by services provide transparency and accountability to users.
The Bill will create what we are all after, which is a new culture of transparency and accountability in the tech sector. For the reasons I have laid out, we are confident that the existing provisions are sufficiently broad and robust to provide that. As such, I hope the noble Lord feels sufficiently reassured to withdraw the amendment.
My Lords, that was a good, quick debate and an opportunity for the noble Viscount to put some things on the record, and explain some others, which is helpful. It is always good to get endorsement around what we are doing from both the noble Lord, Lord Allan, and the noble Baroness, Lady Fox. That is a great spread of opinion. I loved the sense of the challenge as to whether anyone ever reads the transparency reports whenever they are published; I imagine AI will be reading and summarising them, and making sure they are not written as gobbledegook.
On the basis of what we have heard and if we can get some reassurance that strong transparency is accompanied by strong parliamentary scrutiny, then I am happy to withdraw the amendment.
Amendment 160A withdrawn.
Amendment 160B not moved.
Clause 68 agreed.
Amendment 161 not moved.
Schedule 8: Transparency reports by providers of Category 1 services, Category 2A services and Category 2B services
Amendments 162 to 181 not moved.
Schedule 8 agreed.
Clause 69: OFCOM’s guidance about transparency reports
Amendment 181A not moved.
Clause 69 agreed.
Amendment 182 not moved.
Clause 70: “Pornographic content”, “provider pornographic content”, “regulated provider pornographic content”
Amendments 183 and 183ZA not moved.
Clause 70 agreed.
Clause 71: Scope of duties about regulated provider pornographic content
Amendment 183A not moved.
Clause 71 agreed.
Clause 72: Duties about regulated provider pornographic content
Amendments 183B to 185 not moved.
Clause 72 agreed.
Clause 73 agreed.
Amendment 185A
Moved by
185A: After Clause 73, insert the following new Clause—
“Duties on providers of online marketplace services
(1) This section sets out duties that apply in relation to providers of online marketplace services.
(2) A duty to put in place proportionate systems and processes to prevent child users from encountering listings of knives for sale on the platform, including (where appropriate) excluding relevant listings from advertising or other algorithms.
(3) A duty to put in place proportionate systems and processes to identify and remove listings of knives or similar products which are marketed in a manner which would reasonably appear to a user to—
(a) promote violence or threatening behaviour,
(b) encourage self-harm, or
(c) look menacing.
(4) A duty to put in place proportionate systems and processes to ensure, beyond reasonable doubt, that any purchaser of a knife meets or exceeds the minimum legal age for purchasing such items.
(5) For the purposes of this section, the online marketplace may have regard to different age restrictions in different parts of the United Kingdom.
(6) For the purposes of subsection (3)(c), a knife may look menacing if it is, or appears to be similar to, a “zombie knife”, “cyclone knife” or machete.
(7) In this section, “online marketplace service” means a service using software, including a website, part of a website or an application, operated by or on behalf of a trader, which allows consumers to conclude distance contracts with other traders or consumers.”
Member’s explanatory statement
This new Clause would introduce duties on online marketplaces to limit child access to listings of knives, and to take proactive steps to identify and remove any listings of knives or similar products which refer to violence or self-harm. While online sales of knives are not illegal, under-18s (under-16s in Scotland) should not be able to purchase them.
My Lords, I have Amendments 185A and 268AA in this group. They are on different subjects, but I will deal with them in the same contribution.
Amendment 185A is a new clause that would introduce duties on online marketplaces to limit child access to listings of knives and take proactive steps to identify and remove any listings of knives or products such as ornamental zombie knives that are suggestive of acts of violence or self-harm. I am sure the Minister will be familiar with the Ronan Kanda case that has given rise to our bringing this amendment forward. The case is particularly horrible; as I understand it, sentencing is still outstanding. Two young boys bought ninja blades and machetes online and ultimately killed another younger boy with them. It has been widely featured in news outlets and is particularly distressing. We have had some debate on this in another place.
As I understand it, the Government have announced a consultation on this, among other things, looking at banning the sale of machetes and knives that appear to have no practical use other than being designed to look menacing or suitable for combat. We support the consultation and the steps set out in it, but the amendment provides a chance to probe the extent to which this Bill will apply to the dark web, where a lot of these products are available for purchase. The explanatory statement contains a reference to this, so I hope the Minister is briefed on the point. It would be very helpful to know exactly what the Government’s intention is on this, because we clearly need to look at the sites and try to regulate them much better than they are currently regulated online. I am especially concerned about the dark web.
The second amendment relates to racist abuse; I have brought the subject before the House before, but this is rather different. It is a bit of a carbon copy of Amendment 271, which noble Lords have already debated. It is there for probing purposes, designed to tease out exactly how the Government see public figures, particularly sports stars such as Marcus Rashford and Bukayo Saka, and how they think they are supposed to deal with the torrents of racist abuse that they receive. I know that there have been convictions for racist content online, but most of the abuse goes unpunished. It is not 100% clear that much of it will be identified and removed under the priority offence provisions. For instance, does posting banana emojis in response to a black footballer’s Instagram post constitute an offence, or is it just a horrible thing that people do? We need to understand better how the law will act in this field.
There has been a lot of debate about this issue, it is a very sensitive matter and we need to get to the bottom of it. A year and a half ago, the Government responded to my amendment bringing online racist abuse into the scope of what is dealt with as an offence, which we very much welcomed, but we need to understand better how these provisions will work. I look forward to the Minister setting that out in his response. I beg to move.
My Lords, I rise to speak primarily to the amendments in the name of my noble friend Lord Clement-Jones, but I will also touch on Amendment 268AA at the same time. The amendments that I am particularly interested in are Amendments 200 and 201 on regulatory co-operation. I strongly support the need for this, and I will illustrate that with some concrete examples of why this is essential to bring to life the kinds of challenges that need to be dealt with.
The first example relates to trying to deal with the sexual grooming of children online, where platforms are able to develop techniques to detect it. They can do that by analysing the behaviour of users, trying to detect whether older users are consistently trying to approach younger users, and examining the content of the messages they may be sending to them, where that is visible. These are clearly highly intrusive techniques. If a platform is subject to the general data protection regulation, or the UK version of it, it needs to be very mindful of privacy rights. We clearly have, there, several potentially interested bodies in the UK environment: the child protection agencies; in future, Ofcom, seeking to ensure that the platform has met its duty of care; and the Information Commissioner’s Office.
A platform, in a sense, can be neutral as to what it is instructed to do by the regulator. Certainly, my experience was that the platforms wanted to do those kinds of activities, but they are neutral in the sense that they will do what they are told is legal. There, you need clarity from the regulators together to say, “Yes, we have looked at this and you are not going to do something on the instruction of the child safety agency and then get criticised, and potentially fined, by the data protection regulator for doing the thing you have been instructed to do”—so we need those agencies to work together.
The second example is in the area of co-operation around antiterrorism, another key issue. The platforms have created something called the Global Internet Forum to Counter Terrorism. Within that forum, they share tools and techniques—things such as databases of information about terrorist content and systems that you can use to detect it—and you are encouraged within that forum to share those tools and techniques with smaller platforms and competitors. Clearly, again, there is a very significant set of questions, and if you are in a discussion around that, the lawyers will ask, “Have the competition lawyers cleared this?” Again, therefore, something that is in the public interest—that all the platforms should be using similar kinds of technology to detect terrorist content—is something where you need a view not just from the counterterrorism people but also, in our case, from the Competition and Markets Authority. So, again, you need those regulators to work together.
The final example is one which I know is dear to the heart of the noble Baroness, Lady Morgan of Cotes: fraud, which we have dealt with. There you might have patterns of behaviour where information comes from the telecoms companies regulated by Ofcom, the internet service providers regulated by Ofcom, and the financial institutions regulated by their own family of regulators, and they may want to share data with each other, which is again something subject to the Information Commissioner’s Office. So, again, if we are going to give platforms instructions, which we rightly do in this legislation, and say, “Look, we want you to get tougher on online fraudsters; we want you to demonstrate a duty of care there”, the platforms will need those regulators—the financial regulators, Ofcom and the Information Commissioner’s Office—to sort those things out.
Having a forum such as the one proposed in Amendment 201, where these really difficult issues can be thrashed out and clear guidance can be given to online services, will be much more efficient than what sometimes happened in the past, where you had the left hand and the right hand of the regulatory world pulling you in different directions. I know that we have the Digital Regulation Cooperation Forum. If we can build on those institutions, it is essential and ideal that they have their input before the guidance is issued, rather than have a platform comply with guidance from regulator A and then get dinged by regulator B for doing the thing that they have been instructed to do.
That leads to the very sensible Amendment 200 on skilled persons. Again, Ofcom is going to be able to call in skilled persons. In an area such as data protection, that might be a data protection lawyer, but, equally, it might be that somebody who works at the Information Commissioner’s Office is actually best placed to give advice. Amendment 200—the first of the two that talks about skilled persons being able to come from regulators—makes sense.
Finally, I will touch on the issues raised in Amendment 268AA—I listened carefully and understand that it is a probing amendment. It raises some quite fundamental questions of principle—I suspect that the noble Baroness, Lady Fox, might want to come in on these—and it has been dealt with in the context of Germany and its Network Enforcement Act: I know the noble Lord, Lord Parkinson of Whitley Bay, can say that in the original German. That Act went in the same direction, motivated by similar concerns around hate speech.
This raises some fundamental questions about what we want from privacy law and what we want in terms of criminal prosecutions. There is a spectrum of offences, and for some I think we have accepted that platforms should report; on child sexual abuse material, platforms have a duty to report every instance to the regulator. When it comes to threats to life, the expectation would be clear, so if you have knowledge—this happens—of an imminent terrorist attack or even of somebody who is about to commit suicide, it is clear that you should go to the police or the relevant authorities with that information. Then you have this broad spectrum of other criminal offences which may be problematic. I do not want to minimise the effect on people of hate speech crimes, but they are of a different order, shall we say, from threat-to-life cases, where I think reporting is broadly supported. We have to make a decision there.
My starting point is to be nervous about platforms acting in that policing capacity for offences that are not at the most extreme end of the spectrum. Individuals who are worried about that activity can go to the police directly themselves and can generally take the content to the police—literally; they can print it off—who can make a judgment about whether to go to the Crown Prosecution Service. I worry about the platforms doing it partly from a constitutional point of view, because I am not sure that I want them acting in that quasi-legal capacity, but also, frankly, from a volume point of view. The risk is that if you put this duty on a platform, because it is really hard to understand what is criminal hate speech and what is merely hateful hate speech, the temptation will be to send everything over. If you do that, first, you have a greater violation of privacy, and secondly, you probably have not helped the police, because they get swamped with reports that they cannot manage.
I hope that is a helpful counterargument to the idea that platforms should automatically report material. However, I recognise that it leaves an open question. When people engage in that kind of behaviour online and it has serious real-world consequences, how do we make sure that they do not feel that it is consequence-free—that they understand that there are consequences? If they have broken the law, they should be prosecuted. There may be something in streamlining the process where a complainant goes to the police and the police are able to access the information they need, having first assessed that it is worth prosecuting and illegal, so that we make that loop work first before we head in the direction of having platforms report content en masse because they believe it may have violated laws where we are not at that most serious end of the spectrum.
My Lords, so few of us are involved in this discussion that we are now able to write each other’s speeches. I thank the noble Lord, Lord Allan of Hallam, for articulating some of my concerns, probably more elegantly than I will myself. I will focus on two amendments in this group; in fact, there are lots of interesting things, but I will focus on both the amendments from the noble Lord, Lord Bassam of Brighton.
On the issue of proactive steps to remove listings of knives for young people, I am so sympathetic to this because in a different area of my life I am pretty preoccupied with the problem of knife crime among young people. It really bothers me and I worry about how we tackle it. My concern of course is that the police should be working harder to solve that problem and that we cannot anticipate that the Bill will solve all social problems. There is a danger of removing the focus from law enforcement in a real-world problem, as though removing how you buy the knife is the issue. I am not convinced that that helps us.
I wanted to reflect on the kind of dilemmas I am having around this in relation to the story of Mizzy that is doing the rounds. He is the 18 year-old who has been posting his prank videos on TikTok and has caused quite a stir. People have seen him wandering into strangers’ homes uninvited, asking random people in the street if they want to die, running off with an elderly lady’s dog and making fun of Orthodox Jews—generally speaking, this 18 year-old is obnoxious. His TikTok videos have gone viral; everybody is discussing them.
This cruelty for kicks genre of filming yourself, showing your face full to the camera and so on, is certainly abhorrent but, as with the discussion about knife crime, I have noticed that some people outside this House are attempting to blame the technology for the problem, saying that the videos should have been removed earlier and that it is TikTok’s fault that we have this anti-social behaviour, whereas I think it is a much deeper, broader social problem to do with the erosion of adult authority and the reluctance of grown-ups to intervene clearly when people are behaving badly—that is my thesis. It is undoubtedly a police matter. The police seem to have taken ages to locate Mizzy. They eventually got him and charged him with very minor offences, so he was on TV being interviewed the other evening, laughing at how weak the law was. Under the laws he was laughing at, he could freely walk into somebody’s house or be obnoxious and get away with it. He said, “We can do what we want”. That mockery throws up problems, but I do not necessarily think that the Bill is the way to solve it.
That leads me to my concerns about Amendment 268AA, because Mizzy was quoted in the Independent newspaper as saying:
“I’m a Black male doing these things and that’s why there’s such an uproar”.
I then went on a social media thread in which any criticism of Mizzy’s behaviour was described as racist harassment. That shows the complexity of what is being called for in Amendment 268AA, which wants platforms to take additional steps
“to combat incidents of online racially aggravated harassment”.
My worry is that we end up with not only Mizzy’s TikTok videos being removed but his critics being removed for racially harassing him, so we have to be very careful here.
Amendment 268AA goes further, because it wants tech companies to push for prosecution. I really think it is a dangerous step to encourage private companies to get tangled up in deciding what is criminal and so on. The noble Lord, Lord Allan, has exactly described my concerns, so I will not repeat them. Maybe I can probe this probing amendment. It also broadens the issue to all forms of harassment.
By the way, the amendment’s explanatory statement mentions the appalling racist abuse aimed at footballers and public figures, but one of the fascinating things was that when we number-crunched and went granular, we found that the majority of that racist abuse seemed to have been generated by bots, which takes us to the position of the noble Lord, Lord Knight, earlier: who would you prosecute in that instance? Bots not even based in the UK were generating what was assumed to be an outbreak of racist abuse among football fans in the UK, but the numbers did not equate to that. There were some people being racist and vile and some things that were generated in these bot farms.
To go back to the amendment, it goes on to broaden the issue out to
“other forms of harassment and threatening or abusive behaviour”.
Again, this is much more complicated in today’s climate, because those kinds of accusation can be deployed for bad faith reasons, particularly against public figures.
We have an example close to this House. I hope that Members have been following and will show solidarity over what has been happening to the noble Baroness, Lady Falkner of Margravine, who is chair of the Equality and Human Rights Commission and tasked with upholding equality law but is at the centre of a vicious internal row after her officials filed a dossier of complaints about her. They have alleged that she is guilty of harassment. A KC is being brought in, there are 40 complaints and the whole thing is costing a fortune for both taxpayers and the noble Baroness herself.
It coincided with the noble Baroness, Lady Falkner, advising Ministers to update the definition of sex in the Equality Act 2010 to make clear that it refers to biological sex and producing official advice clarifying that trans women can be lawfully excluded from female-only spaces. We know how toxic that whole debate is.
Many of us feel that a lot of the accusations against the noble Baroness are ideologically and politically motivated vexatious complaints. I am distressed to read newspaper reports that say that she has been close to tears and has asked why anyone would go into public service. All this is for the crime of being a regulator upholding and clarifying the law. I hope it does not happen to the person who ends up regulating Ofcom—ending up close to tears as he stands accused of harassment, abusive behaviour and so on.
The point is that she is the one being accused of harassment. I have seen the vile abuse that she has received online. It is completely defamatory, vicious abuse and yet somehow it ends up being that, because she does not provide psychological safety at work and because of her views, she is accused of harassment and is the one in the firing line. I do not want us to introduce that kind of complexity—this is what I have been worried about throughout—into what is banned, removed or sent to the police as examples of harassment or hate crime.
I know that is not the intention of these amendments; it is the unintended consequences that I dread.
My Lords, I will speak chiefly to Amendment 262 in my name, although in speaking after the noble Baroness, Lady Fox, who suggested that the grown-ups should control anti-social behaviour by young people online, I note that there is a great deal of anti-social behaviour online from people of all ages. This is relevant to my Amendment 262.
It is a very simple amendment and would require the Secretary of State to consult with young people by means of an advisory board consisting of people aged 25 and under when reviewing the effectiveness and proportionality of this legislation. This amendment is a practical delivery of some of the discussion we had earlier in this Committee when we were talking about including the Convention on the Rights of the Child in the Bill. There is a commonly repeated phrase, “Nothing about us without us”. It was popularised by disability activists in the 1990s, although in doing a little research for this I found that it originates in Latin in Poland in the 15th century. So it is an idea that has been around for a long while and is seen as a democratic standard. It is perhaps a variation of the old “No taxation without representation”.
This suggestion of an advisory board for the Secretary of State is because we know from the discussion earlier on the children’s rights amendments that globally one in three people online is a child under the age of 18. This comes to the point of the construction of your Lordships’ House. Most of us are a very long way removed in experience and age—some of us further than others. The people in this Committee thinking about a 12 year-old online now are parents, grandparents and great-grandparents. I venture to say that it is very likely that the Secretary of State is at least a generation older than many of the people who will be affected by the Bill’s provisions.
This reflects something that I also did on the Health and Care Bill. To introduce an advisory panel of young people reporting directly to the Secretary of State would ensure a direct voice in legislation that particularly affects young people. We know that under-18s across the UK do not have any role in elections to the other place, although 16 and 17 year-olds have a role in other elections in Wales and Scotland now. This is really a simple, clear, democratic step. I suspect the Minister might be inclined to say, “We are going to talk to charities and adults who represent children”. I suggest that what we really need here is a direct voice being fed in.
I want to reflect on a recent comment piece in the Guardian that made a very interesting argument: that there cannot be, now or in the future, any such thing as a digital native. Think of the experience of someone 15 or 20 years ago; yes, they already had the internet but it was a very different beast to what we have now. If we refer back to some of the earlier groups, we were starting to ask what an internet with widespread so-called generative artificial intelligence would look like. That is an internet which is very different from even the one that a 20 year-old is experiencing now.
It is absolutely crucial that we have that direct voice coming in from young people with experience of what it is like. They are an expert on what it is like to be a 12 year-old, a 15 year-old or a 20 year-old now, in a way that no one else can possibly be, so that is my amendment.
I will briefly comment on a couple of other amendments in this group. I am really hoping that the Minister is going to say that the Government will agree with the amendments that replace the gendered term “chairman” with “chair”. I cannot imagine why we are still writing legislation in 2023 with such gendered terms.
I also want to comment on the amendments from the noble Lord, Lord Stevenson of Balmacara, whom we have not heard from yet. They are Amendments 202ZA and 210A, both of which refer to “journalistic material” and sources. What I want to put on record relates to the Minister’s response to my comments on journalistic sources and encryption on day 3 in Committee. He said then that
“there is no intention or expectation that the tools required to be used under this power would result in a compromising of those sources”.—[Official Report, 27/4/23; col. 1325.]
He was referring to journalistic sources. I have had a great number of journalists and their representatives reaching out to me, pointing to the terms used by the Minister. They have said that if those algorithms, searches or decryption tools are let loose, there is no way of being able to say, “That’s a bit of journalism, so the tool’s not going to apply to it”. That simply does not add up. The amendments in this group are getting into that much broader issue, so I look forward to hearing from the noble Lord, Lord Stevenson, on them.
My Lords, this is the most miscellaneous of all the groups that we have had, so it has rightly been labelled as such—and the competition has been pretty strong. I want to come back to the amendments of the noble Lord, Lord Stevenson, and of the noble Lord, Lord Bassam, but first I want to deal with my Amendments 200 and 201 and to put on the record the arguments there.
Again, if I refer back to our joint report, we were strongly of the view—alongside the Communications and Digital Committee—that there should be a statutory requirement for regulators
“to cooperate and consult with one another”.
Although we welcomed the formation of the DRCF, it seemed to us that there should be a much firmer duty. I was pleased to hear the examples that my noble friend put forward of the kinds of co-operation that will be needed. The noble Baroness, Lady Morgan, clearly understands that, particularly in the area of fraud, it could be the FCA or the ICO, and it could be Ofcom in terms of social media. There is a range of aspects to this—it could be the ASA.
These bodies need to co-operate. As my noble friend pointed out, they can apparently conflict; therefore, co-operating on the way that they advise those who are subject to regulation is rather important. It is not just about the members of the Digital Regulation Cooperation Forum. Even the IWF and the ASA could be included in that, not to mention other regulators in this analogous space. That forum has rightly been labelled as “Digital”, and digital business is now all-pervasive and involves a huge number of regulatory aspects.
Although in this context Ofcom will have the most relevant powers and expertise, and many regulators will look to it for help in tackling online safety issues, effective public protection will be achieved through proper regulatory co-operation. Therefore, Ofcom should be empowered to co-operate with others to share information. As much as it can, Ofcom should be enabled to work with other regulators and share online safety information with them.
It has been very heartening to see the noble Lord, Lord Grade, in his place, even on a Thursday afternoon, and heartening to see how Ofcom has engaged throughout the passage of the Bill. We know the skills that it is bringing on board, and with those skills we want it to bring other regulators into its work. It seems that Ofcom is taking the lead on those algorithmic understanding skills, but we need Ofcom to have the duty to co-operate with the other regulators on this as well.
Strangely, in Clause 103 the Bill gives Ofcom the general ability to co-operate with overseas regulators, but it is largely silent on co-operation with UK regulators. Indeed, the Communications Act 2003 limits the UK regulators with which Ofcom can share information, excluding the ICO, for example, which is rather perverse in these circumstances. However, the Bill has a permissive approach to overseas regulators so, again, it should extend co-operation and information-sharing in respect of online safety to include regulators overseeing the offences in Schedule 7 that we have spent some time talking about today—the enforcement authorities, for instance, responsible for the offences in relation to priority harms to children and priority offences regarding adults. Elsewhere in regulation, the Financial Conduct Authority may have a general duty to co-operate. The reverse may also be true, so that duty of co-operation will need to work both ways.
As my noble friend Lord Allan said, Amendment 200, the skilled persons provision, is very straightforward. It is just to give the formal power to be able to use the expertise from a different regulator. It is a very well-known procedure to bring skilled persons into inquiries, which is exactly what is intended there.
Both amendments tabled by the noble Lord, Lord Bassam, are rather miscellaneous too, but are not without merit, particularly Amendment 185A. I note that I agree with the noble Baroness, Lady Fox: I 100% support the intention behind the amendment but wonder whether the Bill is the right vehicle for it. No doubt the Minister will answer regarding the scope and how practical it would be. I absolutely applaud the noble Lord for campaigning on this issue. It is extraordinarily important, because we have seen some tragic outcomes of these weapons being available for sale online.
Amendment 268AA, also tabled by the noble Lord, Lord Bassam, is entirely different. Our Joint Committee heard evidence from Edleen John of the FA and Rio Ferdinand about abuse online. It was powerful stuff. I tend to agree with my noble friend. We have talked about user empowerment, the tools for it and, particularly in the context of violence against women and girls, the need for a way to report that kind of abuse or other forms of content online. This is a candidate for that kind of treatment. While platforms obviously need to deal with illegal content and have systems to prevent it and so on, having assessed risk in the way that we have heard about previously, I do not believe that expecting the platforms to pick it up and report it, turning them into a sort of proto-enforcer, is the most effective way. We have to empower users. I absolutely share the objectives set out.
My Lords, when I brought an amendment to a police Bill, my local football club said to me that it was anticipating spending something like £100,000 a year trying to create and develop filters, which were commercially available, to stop its footballers being able to see the abuse that they were getting online. It did that for a very sensible commercial reason because those footballers’ performance was affected by the abuse they got. I want to know how the noble Lord sees this working if not by having some form of intervention that involves the platforms. Obviously, there is a commercial benefit to providers of filters et cetera, but it is quite hard for those who have been victims to see a way to make this useful to them without some external form of support.
I absolutely take what the noble Lord is saying, and I am not saying that the platforms do not have responsibility. Of course they do: the whole Bill is about the platforms taking responsibility with risk assessment, adhering to their terms of service, transparency about how those terms are operating, et cetera. It is purely on the question of whether they need to be reporting that content when it occurs. They have takedown responsibilities for illegal content or content that may be seen by children and so on, but it is about whether they have the duty to report to the police. It may seem a relatively narrow point, but it is quite important that we go with the framework. Many of us have said many times that we regret the absence of “legal but harmful” but, given where we are, we basically have to go with that architecture.
I very much enjoyed listening to the noble Baroness, Lady Bennett. No opportunity is lost in the course of the Bill to talk about ChatGPT or GPT-4, and this was no exception. It means that we need to listen to how young people are responding to the way that this legislation operates. I am fully in favour of whatever mechanism achieves that. It does not need to be statutory, but I very much hope that we do not treat this just as the end of the process but will see how the Bill works out and will listen and learn from experience—particularly from young people, who are especially vulnerable to much of this content and to the way that the algorithms on social media work.
I am so sorry. With due respect to the noble Lord, Lord Stevenson, the noble Baroness, Lady Bennett, reminded me that his Amendments 202ZA and 210A, late entrants into the miscellaneous group, go very much with the grain of what we are trying to achieve in the area of encryption. We had quite a long debate about encryption on Clause 110. As ever, the noble Lord has rather cunningly produced something that I think will get us through the eye of the free speech needle. They are two very cunning amendments.
I thank the noble Lord for that. Free expression, my Lords, not free speech.
Freedom of expression.
Yes, freedom of expression. That is right.
I will start where the noble Lord, Lord Clement-Jones, finished, although I want to come back and cover other things. This is a very complicated group. I do not think we can do it quickly, as each issue is important and is worth trying to take forward.
Amendments in my name that came very late have been included here. Unfortunately, we did not have time to degroup them. I think they would have been better on their own, but they are here, and we will have the debate. Amendments 202ZA and 210A look as if they have come from a very different place, but, as the noble Lord, Lord Clement-Jones, and the noble Baroness, Lady Bennett, have said, they are a continuation of the debate we were having a couple of days ago on encryption. They are proposed as a compromise.
I hope that the end result will be that I will not move them, but that we can have an offline meeting about them to see whether there is a way forward on this. We left the debate on the powers the Bill attempts to take on encryption in a slightly unbalanced place. It is clear that, for very good and persuasive reasons, where there may be criminality happening on an encrypted service, powers will have to be available to those responsible for prosecuting that criminal activity so that they can access the necessary evidence. We do not dispute that at all.
How do you do that when it is fully encrypted and breaking the encryption raises dangers and difficulties? How do you do it if you are relying on not fully tested technological solutions, which require giving Ofcom powers to commission and operate through the companies a procedure we do not yet know will exist? It may work for indecent images but almost certainly will not work for counterterrorism. How do you do it in a way which is guided by the principle that the ability to get the data, when the material has been transmitted in an encrypted form, should not come at the expense of freedom of expression? A technical solution therefore looks like a possible winner. It may be one in the future, but I do not believe we are there yet; we have been promised a meeting on this topic, and I am looking forward to it.
Thinking again about this, and having received further correspondence from others outside who have been watching this debate very closely, the amendments to Clauses 110 and 112 are suggested as a compromise which might get us to that point. They take us down a slightly different route—I am not sure that this has been explored with the Bill team, and therefore we should have a meeting to discuss it—of digging a bit deeper into what would constitute reasonable grounds for persuading those responsible for hosting encrypted material that there is convincing evidence to break their rule of non-interference, at least to the level of metadata. The key here is not the content itself but the ability to reach back to those attempting to use encrypted systems for criminal or other illegal behaviour.
In that sense, what is proposed here is a requirement on Ofcom, if the matter is left with Ofcom—I still believe it would be better if there were third-party review at a judicial level, following the RIPA proposals. The suggestion might command support in the industry because it allows for a definitional approach: making sure that we are talking about proper journalistic material and taking that as the route forward, so that those who are concerned on the journalistic side would have reassurance that the material would not be accessed and used in a way detrimental to their activity, and that it could be protected. I will not take it any further than that, unless others would like me to. That is the purpose behind this amendment. I am sorry that it was late—it should not really have been in this group—but it is good to have got it on the table. I hope it will feed into the discussions we are due to have.
Having said that, I will briefly go back to my noble friend Lord Bassam’s amendments. The noble Lord, Lord Clement-Jones, made a couple of the points I wanted to make, but I will reinforce them. I am particularly glad that my noble friend came, given that his throat is as bad as it is—I am sure that it is entirely due to his overenthusiastic support for his football team. No doubt he celebrated late into the night its success in getting into the Europa competition. I never use sporting metaphors, but I have to use one for my noble friend Lord Bassam.
On knives, I am pleased to see this here—we had a good response to it, and I look forward to the Minister’s response. I first came across this issue when I was relatively new in your Lordships’ House. I had a placement with the Met, over a period of time, to get to know how it worked and everything else. It was a fantastic experience that was organised well; anyone who has not done it should do it. It is a good way of learning a bit more about something that is clearly in the public consciousness at the moment.
One of my visits was to a group of young officers operating in and around Brixton. I spent three days there and experienced a riot that they had not anticipated, which was quite exciting. The main point was that we spoke a lot about knives and their role in society. The evidence I saw, in practice, was that this was a burgeoning problem that the police were not well equipped to deal with—this was five or six years ago. It was not for want of trying; it was just that the way the gang culture operated in Brixton, as I understood it, was that the responsibility for enrolling, for maintaining discipline and, subsequently, for operating a gang there was largely governed by rules well away from those recognised in civilised society. The methods of control were knives being placed into the bodies of persons who were being disciplined. The police had no way of coping with that.
Part of the problem with this, as my noble friend Lord Bassam mentioned, is the supply of very unpleasant weapons coming in, usually ordered through the dark web. Again, the police felt that they did not have the equipment, knowledge, skills or even the time to track them down. They were always chasing their tail and were never catching up—they could never keep ahead of it. A really important issue is buried in this amendment; we need to consider it more broadly and society needs to take account of it. If there is an issue within the Bill that should be addressed, it is that. We would like it discussed and hope it will be thought about and implemented if possible.
On Amendment 268AA, I will go back to what the noble Lord, Lord Clement-Jones, said about the evidence we received in the Joint Committee—it was extraordinarily powerful, particularly that from Rio Ferdinand but also that from others who accompanied him on that occasion—about the impact that the internet was having on the health and well-being of players, particularly those affected by abuse after games. He said—I am sure that he will not mind me referring to this—that, before the internet got to the point where it is now, there was still terrible abuse in the stadiums when you were playing but, because it did not come with you when you left the stadium, you were able to relax, go home and get away from it. With the internet, however, you see it on your timeline and in tweets, and it is sent to you by your friends—it becomes inescapable, 24/7. It became a real burden, and he saw the impact on younger players—we have seen plenty of evidence of that.
When we reflected in the committee as a result of that evidence, we were working with a version of the Bill that had Clause 11, on legal but harmful content. We were trying to find ways to get a better sense of balance. The committee clearly said that it did not think that legal but harmful was an appropriate way forward, but we certainly also recognised that that meant a process would have to be in place to deal with material that is not what society wishes to see circulating and influencing people—young people in particular. We recognised then that there was a problem with having an online safety Act that does not require companies to act on misogynistic abuse or hatred being stirred up against people of colour or disabled people, to give but two examples of where the gap would emerge. Although we recommended that the legal but harmful clause should be removed, we said that there had to be
“a statutory requirement on providers to have in place proportionate systems and processes to identify and mitigate reasonably foreseeable risks of harm arising from regulated activities defined under the Bill”.
We are not there yet.
Amendment 268AA in the name of my noble friend Lord Bassam gets us a little into the issues of racially aggravated behaviour, harassment and other forms of abuse, and I am very interested to hear what the Government’s response to it will be. I am not sure that we yet have the tools in the Bill, or in the terms of service approach that has been taken, that would allow that proposal to happen, but maybe the Minister will be able to help us with this when he responds.
I will touch on other amendments in this group. I hope that we will receive a positive response to the amendment from my noble friend Lady Merron, who unfortunately cannot be with us at the moment, in relation to instances of gendered language.
The amendments proposed and spoken to by the noble Lord, Lord Clement-Jones, are important in themselves, but also play to a bigger point, particularly Amendment 201. We do not have much down on this in relation to the question of how Ofcom will relate to other regulators, but the case he made was very persuasive. I hope that the Minister can say something about that. The idea is that we will muddle on with the existing arrangement of informal networking between very powerful regulators—each of whom will have, as the noble Lord said, sometimes conflicting rules about how things go—which can be brokered through a co-operation agreement. But it would be better if it were accompanied by a set of real powers to work together, including joint powers in cases where there are issues affecting both personal data and the impact that Ofcom will have on companies’ operations. We should also recognise that there will be other regulators joining the club every year that will need to be involved and brought in. Some basic understanding of the rules—maybe not in the Bill, but certainly forecast to be brought forward in a future piece of legislation—seems vital to give them the context within which they can begin to work together, and from which we can learn the lessons that will be necessary when new powers are prepared. I am very supportive of Amendment 201, and I hope that there will be a positive thought about how we might take it forward.
The noble Lord, Lord Bethell, is not in his place so, presumably, will not speak to his Amendment 220D, which would give Ofcom powers to delegate some of its regulatory powers to another body. Other similar amendments are coming up later, so maybe that point will be picked up then. However, I will put on the record, as the noble Baroness, Lady Stowell, has said on other occasions, that there are problems with simply adding other regulators into what is a very powerful statutory body, particularly if they are not, in any way, public bodies themselves. With no disrespect to those currently working in them, I think that, where a charitable body or a private company is engaging with bodies such as Ofcom, there should be no question of statutory powers being delegated to it; that must not happen. A form of contract for particular work to be delivered under the control of the statutory body, Ofcom, is fine, but we should not be talking about coequal powers; that would be wrong.
Finally, I do not want to anticipate the Minister in introducing the amendments in his name, but we have no objections to them. I am sure that they will work exactly as he proposes and that they will be acceptable.
My Lords, this has been a miscellany, indeed. We must be making progress if we are picking up amendments such as these. I thank noble Lords who have spoken to the amendments and the issues covered in them.
I turn first to Amendment 185A brought to us by the noble Lord, Lord Bassam of Brighton, which seeks to add duties on online marketplaces to limit children’s access to the sale of knives, and proactively to identify and remove listings which appear to encourage the sale of knives for the purposes of violence or self-harm. Tackling knife crime is a priority for His Majesty’s Government; we are determined to crack down on this violent scourge, which is devastating our communities. I hope that he will forgive me for not drawing on the case he mentioned, as it is still sub judice. However, I certainly take the point he makes; we are all too aware of cases like it up and down the country. I received an email recently from Amanda and Stuart Stephens, whose son, Olly, was murdered by two boys, one of whom was armed with a knife. All these cases are very much in our minds as we debate the Bill.
Let me try to reassure them and the noble Lord as well as other Members of the Committee that the Bill, through its existing duties and other laws on the statute book, already achieves what the noble Lord seeks with his amendment. The sale of offensive weapons and of knives to people under the age of 18 are criminal offences. Any online retailer which directly sells these prohibited items can already be held criminally liable. Once in force, the Bill will ensure that technology platforms, including online marketplaces, prevent third parties from using their platform to sell offensive weapons or knives to people under the age of 18. The Bill lists both these offences as priority offences, meaning that user-to-user services, including online marketplaces, will have a statutory obligation proactively to prevent these offences taking place on their services.
I am grateful to the noble Lord, Lord Stevenson of Balmacara, for his support for the government amendments. The Government are committed to ensuring that the regime set up by the Bill is cost-neutral to the taxpayer. As such, it will be funded via annual fees on regulated services whose revenue is at or above a set threshold. At present, Ofcom is preparing for its new duties as regulator, funding this by the retention of receipts under the Wireless Telegraphy Act 2006. Once the Bill that we are debating in this Committee is in place, Ofcom will charge fees to recoup this money alongside funding its ongoing costs. As the Bill is still before this Committee, Ofcom has not yet been granted the information-gathering powers which are necessary to prepare the fee regime. This means it cannot issue information requests for financial information from firms. As such, only when the Bill passes will Ofcom be able to begin implementing the fee regime.
In consideration of this, the Government have decided that fees should be charged from the financial year 2025-26 at the earliest. The decision does not affect implementation timings for any other areas of the regime. Amendments 186A to 186C will ensure that the fee regime functions in a fair and practical manner under these timings. The amendments ensure that the costs that Ofcom incurs while preparing for, and exercising, its online safety functions are met by the retention of receipts under the Wireless Telegraphy Act, up until the point when the fee regime is operational. They also ensure that these costs are recovered in a proportionate manner by extending the Schedule 10 recouping regime. I hope that that will have the support of this Committee.
Amendment 200 seeks to expand the definition of “skilled person” to include a regulator or self-regulatory body. I assure the noble Lord, Lord Allan of Hallam, that the Bill’s definition of a “skilled person” is already sufficiently broad to include a regulator or self-regulatory body. As set out in Clause 207(1), the Bill’s existing definition of “person” includes “any organisation or association of persons”. This means that the Bill’s definition of “skilled person” enables Ofcom to appoint an individual, organisation, body of persons or association of persons which appears to it to have the skills necessary to prepare a specific report. That includes a regulator or self-regulatory body, as the noble Lord’s amendment suggests.
On regulatory co-operation, I am conscious that I promised the noble Lord, Lord Russell of Liverpool, further details on this when he asked about it in an earlier grouping, so I shall set them out now so that he can consult them in the official record. I reassure the noble Lord and other noble Lords that Ofcom has strong existing relationships with domestic regulators. That has been supported by the establishment of the Digital Regulation Cooperation Forum, which we have discussed before. Effective regulatory co-ordination is essential for addressing the cross-cutting opportunities and challenges posed by digital technologies and services.
The creation of the forum was a significant step forward in delivering greater coherence at the institutional level and has been widely welcomed by industry and consumer representatives. Its formation has been particularly timely in bringing together the key regulators involved in the proposed new online safety, data and digital competition regimes. Its work has already delivered real and wide-ranging impact, including landmark policy statements clarifying the interactions between digital regulatory regimes, research into cross-cutting issues and horizon-scanning activities on new regulatory challenges. It is important to note that the Information Commissioner’s Office is a member of the forum. We will continue to assess how best to support collaboration between digital regulators and ensure that their approaches are joined up.
In addition, Ofcom already has a statutory footing to share information with UK regulators under the Communications Act 2003. Section 393 of that Act includes provisions for sharing information between Ofcom and other regulators in the UK, such as the Information Commissioner’s Office, the Financial Conduct Authority and the Competition and Markets Authority. So, we believe the issues set out in the amendment are covered.
Let me turn now to the cunning amendments from the noble Lord, Lord Stevenson, which seek to introduce special provisions to apply in cases where a notice issued under Clause 110 would involve the monitoring of journalistic material or material identifying journalistic sources. I appreciate the way he has set those out and I am very happy to have the more detailed discussion with the Bill team that he suggested. Let me just say, though, that the Government are fully committed to protecting the integrity of journalistic material and there is no intention that the technologies required under Clause 110 in relation to private communications would identify anything other than child sexual abuse and exploitation content. These powers are subject to strong safeguards to protect the privacy of all users. Any technologies required on private communications must be accredited by Ofcom as being highly accurate in detecting only child sexual exploitation and abuse content. These minimum standards of accuracy will be approved and published by the Secretary of State following advice from Ofcom and will ensure that it is highly unlikely that journalistic material that is not such content would be erroneously flagged or removed.
I am sorry to interrupt. The Minister has twice given a positive response, but he limited it to child sexual exploitation; he did not mention terrorism, which is in fact the bigger issue. Could he confirm that it is both?
Yes, and as I say, I am happy to talk with the noble Lord about this in greater detail. Under the Bill, category 1 companies will have a new duty to safeguard all journalistic content on their platform, which includes citizen journalism. But I will have to take all these points forward with him in our further discussions.
My noble friend Lord Bethell is not here to move his Amendment 220D, which would allow Ofcom to designate online safety regulatory duties under this legislation to other bodies. We have previously discussed a similar issue relating to the Internet Watch Foundation, so I shall not repeat the points that we have already made.
On the amendments on supposedly gendered language in relation to Ofcom advisory committees in Clauses 139 and 155, I appreciate the intention to make it clear that a person of either sex should be able to perform the role of chairman. The Bill uses the term “chairman” to be consistent with the terminology in the Office of Communications Act 2002, and we are confident that this will have no bearing on Ofcom’s decision-making on who will chair the advisory committees that it must establish, just as, I am sure, the noble Lord’s Amendment 56 does not seek to be restrictive about who might be an “ombudsman”.
I appreciate the intention of Amendment 262 from the noble Baroness, Lady Bennett of Manor Castle. It is indeed vital that the review reflects the experience of young people. Clause 159 provides for a review to be undertaken by the Secretary of State, and published and laid before Parliament, to assess the effectiveness of the regulatory framework. There is nothing in the existing legislation that would preclude seeking the views of young people either as part of an advisory group or in other ways. Moreover, the Secretary of State is required to consult Ofcom and other persons she considers appropriate. In relation to young people specifically, it may be that a number of different approaches will be effective—for example, consulting experts or representative groups on children’s experiences online. That could include people of all ages. The regulatory framework is designed to protect all users online, and it is right that we take into account the full spectrum of views from people who experience harms, whatever their age and background, through a consultation process that balances all their interests.
Amendment 268AA from the noble Lord, Lord Bassam, relates to reporting requirements for online abuse and harassment, including where this is racially motivated—an issue we have discussed in Questions and particularly in relation to sport. His amendment would place an additional requirement on all service providers, even those not in scope of the Bill. The Bill’s scope extends only to user-to-user and search services. It has been designed in this way to tackle the risk of harm to users where it is highest. Bringing additional companies in scope would dilute the efforts of the legislation in this important regard.
Clauses 16 and 26 already require companies to set up systems and processes that allow users easily to report illegal content, including illegal online abuse and harassment. This amendment would therefore duplicate this existing requirement. It also seeks to create an additional requirement for companies to report illegal online abuse and harassment to the Crown Prosecution Service. The Bill does not place requirements on in-scope companies to report their investigations into crimes that occur online, other than child exploitation and abuse. This is because the Bill aims to prevent and reduce the proliferation of illegal material and the resulting harm it causes to so many. Additionally, Ofcom will be able to require companies to report on the incidence of illegal content on their platforms in its transparency reports, as well as the steps they are taking to tackle that content.
I hope that reassures the noble Lord that the Bill intends to address the problems he has outlined and those explored in the exchange with the noble Lord, Lord Clement-Jones. With that, I hope that noble Lords will support the government amendments in this group and be satisfied not to press theirs at this point.
My Lords, I listened very carefully to the Minister’s response to both my amendments. He has gone some way to satisfying my concerns. I listened carefully to the concerns of the noble Baroness, Lady Fox, and noble Lords on the Lib Dem Benches. I am obviously content to withdraw my amendment.
I do not quite agree with the Minister’s point about dilution on the last amendment—I see it as strengthening—but I accept that the amendments themselves slightly stretch the purport of this element of the legislation. I shall review the Minister’s comments and I suspect that I shall be satisfied with what he said.
Amendment 185A withdrawn.
Clauses 74 to 78 agreed.
Clause 79: OFCOM’s fees statements
Amendment 186 not moved.
Amendment 186A
Moved by
186A: Clause 79, page 71, line 20, leave out paragraph (b)
Member’s explanatory statement
This amendment omits a provision about recouping OFCOM’s preparatory costs via fees under Part 6 of the Bill, because it is now intended to recoup all preparatory costs incurred before the fees regime is in operation via the charging of additional fees under Schedule 10 (see also the amendment to Schedule 10 in the Minister’s name).
Amendment 186A agreed.
Clause 79, as amended, agreed.
Clause 80: Recovery of OFCOM’s initial costs
Amendment 186B
Moved by
186B: Clause 80, page 71, line 26, leave out from “incurred” to end of line 27 and insert “before the first day of the initial charging year.”
Member’s explanatory statement
This amendment is to the clause introducing Schedule 10 (recovery of OFCOM’s initial costs). The amendment reflects the change to Schedule 10 proposed by the amendment of that Schedule in the Minister’s name.
Amendment 186B agreed.
Clause 80, as amended, agreed.
Schedule 10: Recovery of OFCOM’s initial costs
Amendment 186C
Moved by
186C: Schedule 10, page 212, line 37, leave out from “before” to end of line 39 and insert “the first day of the initial charging year on—
(a) preparations for the exercise of their online safety functions, or
(b) the exercise of their online safety functions;”
Member’s explanatory statement
Schedule 10 enables OFCOM to charge additional fees to recover certain online safety costs which are met by the retention of receipts under the Wireless Telegraphy Act 2006. This amendment extends the Schedule 10 regime to cover all costs incurred before the main fees regime under Part 6 of the Bill is in operation (as opposed to only covering preparatory costs incurred before the commencement of clause 79).
Amendment 186C agreed.
Schedule 10, as amended, agreed.
Clause 81 agreed.
Clause 82: General duties of OFCOM under section 3 of the Communications Act
Amendment 187 not moved.
Clause 82 agreed.
Amendment 188 not moved.
Clauses 83 and 84 agreed.
Amendments 189 to 191 not moved.
Clause 85 agreed.
Schedule 11: Categories of regulated user-to-user services and regulated search services: regulations
Amendment 192
Moved by
192: Schedule 11, page 216, line 30, after “service” insert “, including significant risk of harm,”
Member’s explanatory statement
There are some platforms which, whilst attracting small user numbers, are hubs for extreme hateful content and should be regulated as larger user-to-user services.
My Lords, I am very grateful to the noble Baronesses, Lady Parminter and Lady Deech, and the noble Lord, Lord Mann, for their support. After a miscellaneous selection of amendments, we now come back to a group of quite tight amendments. Given the hour, those scheduling the groupings should be very pleased, because for the first time we have done all the groups that we set out to do this afternoon. I do not want to tempt fate, but I think we will have a good debate before we head off for a little break from the Bill.
I am very sympathetic to the other amendments in this grouping: Amendments 192A and 194. Unsurprisingly, there is a common theme running through them all, which I hope my noble friend the Minister will be able to address in his remarks. These amendments come about because we do not know the exact categorisation of the services we are most concerned about, in this House and beyond, nor how that categorisation process is going to work and be kept under review.
As noble Lords will be aware, the Bill proposes two major categories of regulated company, category 1 and category 2, with a separate carve-out for search services. Much of the discussion about the Bill has focused on the regulatory requirements for category 1 companies but—again, we have not seen the list—it is expected that the list of category 1 companies may number only a few dozen, while thousands and thousands of platforms and search engines may not meet that threshold. Some of those other platforms, while attracting small user numbers, are hubs for extremely hateful content. In a previous debate we heard about the vile racist abuse often aimed at particular groups. Some of these platforms lie almost entirely outside our own experience. They are deliberately designed to host such hateful content and to try to remain under the radar, but they are undoubtedly deeply influential, particularly for the often vulnerable users who access them.
Platforms such as 8kun, 4chan and BitChute are perhaps becoming better known, whereas Odysee, Rumble and Minds remain somewhat obscure. There are numerous others, and all are easily accessible from anyone’s browser. What does the harm caused by these platforms look like? Some examples are in the public domain. For example, the mass shooting in Buffalo, in America, was carried out by a terrorist whose manifesto was inspired by 4chan’s board and who spoke of its influence on him. Later in this debate we are going to hear about specific content related to suicide, self-harm or eating disorders, which we have already debated in other contexts in these Committee proceedings.
The Center for Countering Digital Hate revealed that the four leading forums it analysed for incels—involuntary celibates—were filled with extreme hatred of women, glorification of violence and active discussion of paedophilia. On Gab, an “anti-Jewish meme repository”, grotesque anti-Semitic caricatures of Jews are shared from an account with an offensive name that seeks to deny the Holocaust. Holocaust denial material is similarly shared across BitChute, where it is also possible to find a video on the supposed
“Jewish Plan To Genocide The White Race”
and, of course, 9/11 conspiracy theories. Meanwhile, on Odysee, other than discussion of the supposed “fake Holocaust” one can find discussion of the “Jewish problem”. On Minds, both President Zelensky and President Putin are condemned for having “kike”—an offensive term for Jews—inner circles, while other posts state that communism is Jewish control and the vessel to destroy our freedom.
The Government and many others know very well that these small, high-harm platforms are a problem. MPs in earlier debates on this Bill raised concerns repeatedly. The noble Lord, Lord Austin, raised this at Second Reading in your Lordships’ House and, nearly a year ago, the then Secretary of State issued a ministerial Statement indicating that, while the Government appreciated that small high-harm platforms do damage,
“more research is required before such platforms can be assigned to the category 1 designation for the online safety regime”.
This was despite Ofcom’s road map for online safety making it clear that it had already identified a number of small platforms that are clearly giving cause for concern.
So the case for action, as set out in my remarks and elsewhere, is proven. The Antisemitism Policy Trust has given evidence to the Joint Committee on the draft Bill and the Bill Committee in another place about this. The Community Security Trust, HOPE not hate and many others have data that demonstrates the level of hateful anti-Semitic and other racist and misogynistic abuse on these platforms. I know others will refer to the work of the Samaritans, the Mental Health Foundation and Beat in raising issues around suicide, self-harm and eating disorder content.
Extraordinarily, these are not platforms where this content is stumbled on or somehow hidden. They are set up deliberately to spread this content, to get people to look at it and to amplify this deeply harmful material. These sites act as feeders for hateful messages and activity on mainstream platforms, or as receptors for those directed away from those larger services to niche, hate-filled rabbit holes. We need to think about this as the Bill is implemented. We hope that the larger platforms will take action and live up to the terms of service they say they have; but, without action, this content will unfortunately migrate to smaller platforms, which will still be accessed and will still drive activity in the online and offline worlds. I hope my noble friend the Minister will say something about post-implementation in relation to these platforms.
Amendment 192 is a small, technical amendment. It does not compel Ofcom to add burdens to all small platforms but provides a specific recourse for the Secretary of State to consider the risks of harm as part of the process of categorisation. A small number of well-known, small high-harm sites would be required to add what will ultimately be minimal friction and other measures proportionate to their size. They will be required to deliver enhanced transparency. This can only be for the good, given that in some cases these sites are designed specifically to spread harm and radicalise users towards extreme and even terrorist behaviours.
The Government accept that there is a problem. Internet users broadly accept that there is a problem. It must be sensible, in deciding on categorisation, to look at the risk of harm caused by the platforms. I beg to move.
My Lords, I will speak to Amendment 192A. There can be nothing more comfortable within the terms of parliamentary debate than to find oneself cosseted by the noble Baroness, Lady Morgan, on one side and my noble friend Lord Stevenson on the other. I make no apology for repeating the thrust of the argument of the noble Baroness, but I will narrow the focus to matters that she hinted at which we need to think about in a particular way.
We have already debated suicide, self-harm and eating disorder content hosted by category 1 providers. There is a need for the Bill to do more here, particularly through strengthening the user empowerment duties in Clause 12 so that the safest option is the default. We have covered that ground. This amendment seeks to address the availability of this content on smaller services that will fall outside category 1, as the noble Baroness has said. The cut-off conditions under which services will fall within category 1 are still to be determined; we await further progress on that. However, there are medium-sized and small providers whose activities we need to look at. It is worth repeating—and I am aware that I am repeating—that these include suicide and eating disorder forums whose main business is the sharing and discussion of methods and encouragement to engage in these practices. In other words, they are set up precisely to do that.
We know that there are smaller platforms where users share detailed information about methods of suicide. One of these in particular has been highlighted by families and coroners as playing a role in the suicides of individuals in the UK. Regulation 28 reports—that is, official requests for action—have been issued to DCMS and DHSC by coroners to prevent future comparable deaths.
A recent systematic review, looking at the impact of suicide and self-harm-related videos and photographs, showed that potentially harmful content is concentrated specifically on sites with low levels of moderation. Much of the material which promotes and glorifies this behaviour is unlikely to be criminalised through the Government’s proposed new offence of encouragement to serious self-harm. For example, we would not expect all material which provides explicit instructional information on how to take one’s life using novel and effective methods to be covered by it.
The content has real-world implications. There is clear evidence that when a particular suicide method becomes better known, the effect is not simply that suicidal people switch from one intended method to the novel one, but that suicides occur in people who would not otherwise have taken their own lives. There are, therefore, important public health reasons to minimise the discussion of dangerous and effective suicide methods.
The Bill’s pre-legislative scrutiny committee recommended that the legislation
“adopt a more nuanced approach, based not just on size and high-level functionality, but factors such as risk, reach, user base, safety performance, and business model”.
This amendment is in line with that recommendation, seeking to extend category 1 regulation to services that carry a high level of risk.
The previous Secretary of State appeared to accept this argument—but we have had a lot of Secretaries of State since—and announced a deferred power that would have allowed for the most dangerous forums to be regulated; but the removal of the “legal but harmful” provisions from the legislation means that this power is no longer applicable, as its function related to the “adult risk assessment” duty, which is no longer in the Bill.
This amendment would not shut down dangerous services, but it would make them accountable to Ofcom. It would require them to warn their users of what they were about to see, and it would require them to give users control over the type of content that they see. That is, the Government’s proposed triple shield would apply to them. We would expect that this increased regulatory burden on small platforms would make them more challenging to operate and less appealing to potential users, and would diminish their size and reach over time.
This amendment is entirely in line with the Government’s own approach to dangerous content. It simply seeks to extend the regulatory position that they themselves have arrived at to the very places where much of the most dangerous content resides. Amendment 192A is supported by the Mental Health Foundation, the Samaritans and others that we have been able to consult. It is similar to Amendment 192, which we also support, but this one specifies that the harmful material that Ofcom must take account of relates to self-harm, suicide and eating disorders. I would now be more than happy to give way—eventually, when he chooses to do it—to my noble friend Lord Stevenson, who is not expected at this moment to use the true and full extent of his abilities at being cunning.
My Lords, I rise to offer support for all the amendments in this group, but I will speak principally to Amendment 192A, to which I have added my name and which the noble Lord, Lord Griffiths, has just explained so clearly. It is unfortunate that the noble Baroness, Lady Parminter, cannot be in her place today. She always adds value in any debate, but on this issue in particular I know she would have made a very compelling case for this amendment. I will speak principally about eating disorders, because the issues of self-harm have already been covered and the hour is already late.
The Bill as it stands presumes a direct relationship between the size of a platform and its potential to cause harm. This is simply not the case: a systematic review which we heard mentioned confirmed what all users of the internet already know—that potentially harmful content is often and easily found on smaller, niche sites that will fall outside the scope of category 1. These sites are absolutely not hard to find—they come up on the first page of a Google search—and some hide in plain sight, masquerading, particularly in the case of eating disorder forums, as sources of support, solace or factual information when in fact they encourage and assist people towards dangerous practices. Without this amendment, those sites will continue spreading their harm and eating disorders will continue to have the highest mortality rate of all mental illnesses in the UK.
I was going to say something about the suicide sites, but the point which comes out of the research has already been made: when a novel method of suicide becomes better known, it is not just that people intending to kill themselves switch from one method to another but that the prevalence of suicide increases. As the noble Lord said, this is not just about preventing individual tragedies; it is indeed a public health issue.
I very much welcome the steps being taken in the Bill to tackle the prevalence of damaging content, particularly as it applies to children. However, I believe that, as the Bill stands, smaller providers will fly under the radar and vulnerable adults will be harmed—the Bill is extremely light on protections for that category of people. Amendment 192A seeks to ensure that the Bill tackles content wherever it gives rise to a very high risk of harm, irrespective of the platform’s size. Arguments about regulatory burden on small sites should not apply when health, well-being and lives are at risk. The pre-legislative committee was absolutely alive to this, and its recommendations highlighted the risks posed by small, high-risk companies. As we heard, the previous Secretary of State announced a deferred power, but that lapsed when the adult risk assessments were removed.
I fear that the current approach in the Bill will push people who promote this kind of content simply to create smaller platforms where they are beyond the reach of the law. It is not clear whether they would be caught instead by the Government’s new offence of encouraging or assisting serious self-harm. I know we have not debated that yet, but I cannot understand whether encouragement to starvation would be covered by that new offence. It is probably too early to ask the Minister to clarify that, but if he has the answer, I would like to understand it.
We have heard the term “rabbit hole”; there is a rabbit hole down which people intent on self-harm, or indeed those who suffer from eating disorders, go from larger platforms to smaller and niche ones, where they encounter the very content that feeds their addiction or which fuels and enables their desire to self-harm. As I said in a previous grouping, this cannot be the intention of the Bill; I do not believe it is the intention of the Government; and I hope that the Minister will listen to the arguments that the noble Baroness, Lady Morgan of Cotes, set out so effectively.
My Lords, I am a poor substitute for the noble Baroness, Lady Parminter, in terms of the substance of the issues covered by these amendments, but I am pleased that we have been able to hear from the noble Baroness, Lady Bull, on that. I will make a short contribution on the technology and the challenges of classification, because there are some important issues here that the amendments bring out.
We will be creating rules for categorising platforms. As I understand it, the rules will have a heavy emphasis on user numbers but will not be exclusively linked to user numbers. It would be helpful if the Minister could tease out a little more about how that will work. However, it is right even at this stage to consider the possibility that there will need to be exceptions to those rules and to have a mechanism in place for that.
We need to recognise that services can grow very quickly these days, and some of the highest-risk moments may be those when services have high growth but still very little revenue and infrastructure in place to look after their users. This is a problem generally with stepped models, where you have these great jumps. In a sense, a sliding scale, so that responsibilities increase gradually, would be more rational, but from a practical view it is clearly hard to do that, so we are going to end up with some kind of step model.
We also need to recognise that, from a technical point of view, it is becoming cheaper and easier to build new user-to-user services all the time. That has been the trend for years, but it is certainly the case now. If someone wants to create a service, they can rent the infrastructure from a number of providers rather than buying it, they can use a lot of code that is freely available—they do not need to write as much code as they used to—and they can promote their new service using all the existing social networks, so you can go from zero to significant user numbers in very quick time, and that is getting quicker all the time. I am interested to hear how the Minister expects such services to be regulated.
The noble Baroness, Lady Morgan, referred to niche platforms. There will be some that have no intention to comply, even if we categorise them as a 2B service. The letter will arrive from Ofcom and go in the bin. They will have no interest whatever. Some of the worst services will be like that. The advantage of us ensuring that we bring them into scope is that we can move through the enforcement process quickly and get to business disruption, blocking, or whatever we need to do to get them out of the UK market. Other niche services will be willing to come into line if they are told they are categorised as 2B but given a reasonable set of requirements. Some of Ofcom’s most valuable work might be precisely to work with them: services that are borderline but recognise that they want to have a viable business, and they do not have a viable business by breaking the law. We need to get hold of them and bring them into the net to be able to work with them.
Finally, there is another group which is very mainstream but in the growing phase—busy growing and not worrying about regulation. For that category of company, we need to work with them as they grow, and the critical thing is to get to them early. I think the amendments would help Ofcom to get to them early—ideally, in partnership with other regulators, including the European Union, which is now regulating in a similar way under the Digital Services Act. If we can work with those companies as they come into 2B, then into category 1—in European speak, that is a VLOP, a very large online platform—and get them used to the idea that they will have VLOP and category 1 responsibilities before they get there, we can make a lot more progress. Then we can deliver what we are all trying to deliver, which is a safer internet for people in the UK.
I shall speak very briefly at this hour, just to clarify as much as anything. It seems important to me that there is a distinction between small platforms and large platforms, but my view has never been that if you are small you have no potential harms, any more than that if you are large you are harmful. The exception should remain the exception, not become the rule. We have to be careful of arbitrary categorisation of “small”. We have to decide who is going to be treated as though they are a large category 1 platform. I keep saying this, but I stress it again: do not assume that everybody agrees on what significant risk of harm or hateful content is. It is such highly disputed political territory outside the online world and this House that we must recognise that it is not so straightforward.
I am very sympathetic, by the way, to the speeches made about eating disorders and other issues. I see that very clearly, but other categories of speech are disputed and argued over—I have given loads of examples. We end up where it is assumed that the manifestos of mass shooters appear on these sites but, if you read any of those manifestos, they will often be quoting from mainstream journalists in mainstream newspapers, the Bible and a whole range of things. Just because they are on 4chan, or wherever, is not necessarily the problem; it is much more complicated.
I ask the Minister, and the proposers of the amendment, to some extent: would it not be straightforwardly the case that if there is a worry about a particular small platform, it might be treated differently—
I just want to react to the point about the manifestos of mass shooters. While source material such as the Bible is not in scope, I think the manifesto of a shooter is clear incitement to terrorism, and any platform that is comfortable carrying it is problematic in my view—and, I hope, in the noble Baroness’s view as well.
I was suggesting that we have a bigger problem than it appearing on a small site. It quotes from mainstream media, but it ends up being broadly disseminated and not because it is on a small site. I am not advocating that we all go round carrying the manifestos of mass shooters and legitimising them. I was more making the point that it can be complicated. Would not the solution be that you can make appeals that a small site is treated differently? That is the way we deal with harmful material in general and the way we have dealt with, for example, RT as press without compromising on press freedom. That is the kind of point I am trying to make.
I understand lots of concerns, but I do not want us to get into a situation where we destroy the potential of all smaller platforms—many of them doing huge amounts of social good, part of civil society and all the rest of it—by treating them as though they are large platforms. They just will not have the resources to survive; that is my only point.
My Lords, I am going to be extremely brief given the extremely compelling way that these amendments have been introduced by the noble Baroness, Lady Morgan, and the noble Lord, Lord Griffiths, and contributed to by the noble Baroness, Lady Bull. I thank her for her comments about my noble friend Lady Parminter. I am sure she would have wanted to be here and would have made a very valuable contribution as she did the other day on exactly this subject.
As the noble Baroness, Lady Fox, has illustrated, we have a very different view of risk across this Committee and we are back, in a sense, into that whole area of risk. I just wanted to say that I think we are again being brought back to the very wise words of the Joint Committee. It may sound like special pleading. We keep coming back to this, and the noble Lord, Lord Stevenson, and I are the last people standing on a Thursday afternoon.
We took a lot of evidence in this particular area. We took the trouble to go to Brussels and had a very useful discussion with the Centre on Regulation in Europe and Dr Sally Broughton Micova. We heard a lot about interconnectedness between some of these smaller services and the impact in terms of amplification across other social media sites.
We heard in the UK from some of the larger services about their concerns about the activities of smaller services. You might say, “They would say that, wouldn’t they?”, but they were pretty convincing. We heard from HOPE not hate, the Antisemitism Policy Trust and Stonewall, stressing the role of alternative services.
Of course, we know that these amendments—some of them sponsored by the Mental Health Foundation, as the noble Lord, Lord Griffiths, said, and the Samaritans—have a very important provenance. They recognise that these are big problems. I hope that the Minister will think hard about this. The injunction from the noble Lord, Lord Allan, to consider how all this is going to work in practice is very important. I very much hope that, when we come to consider how this works in practical terms, the Minister will think very seriously about the way in which risk is brought to the fore—the more nuanced approach that we suggested—and about the whole way that profiling by Ofcom will apply. I think that is going to be extremely important as well. I do not think we have yet got to the right place in the Bill in dealing with these risky sites. I very much hope that the Minister will consider this in the quite long period between now and when we next get together.
My Lords, this has been a good little debate with some excellent speeches, which I acknowledge. Like the noble Lord, Lord Clement-Jones, I was looking at the Joint Committee’s report. I concluded that one of the first big issues we discussed was how complicated the categorisation seemed in relation to the task that was being set for Ofcom. We comforted ourselves with the thought that, if you believe that this is basically a risk-assessment exercise and that all the work Ofcom will subsequently do is driven by its risk assessments and its constant reviewing of them, then the categorisation issue is bound to fall away, because the risks will reveal the things that need to happen.
As we have been through the process in our discussions in Committee, we keep coming across issues where proportionality comes into play. The proportionality that I worry about is that which says, “If only a small number of people are affected by this, then obviously less needs to happen at Ofcom level”. We debated this earlier in relation to the amendments on children and seemed to come out in two different positions.
I believe that there should be zero tolerance of children accessing material which is illegal for them, but the Bill does not say that. It says that all Ofcom’s work has to be done in proportion to the impact—not only in the direct work of trying to mitigate harms or illegality that could occur, but taking into account the economic size of the company and the impact that the work would have on its activities. I do not think we can square that off, so I appeal to the Minister, when he comes to respond, to look at it from the other end. Why is it not possible to have a structure which is driven by the risk? If the risk assessment reveals risks that require action, there should not be a constraint simply because the categorisation hurdle has not been met. The risk is what matters. Does he agree?
I am grateful to noble Lords for helping us to reach our target for the first time in this Committee, especially to do so in a way which has given us a good debate on which to send us off into the Whitsun Recess. I am off to the Isle of Skye, so I will make a special detour to Balmacara in honour of the noble Lord.
May I have a postcard?
The noble Lord does not believe anything that I say at this Dispatch Box, but I will send a postcard.
As noble Lords are by now well aware, all services in scope of the Bill, regardless of their size, will be required to take action against illegal content and all services likely to be accessed by children must put in place protections for children. Companies designated as category 1 providers have significant additional duties. These include the overarching transparency, accountability and freedom of expression duties, as well as duties on content of democratic importance, news publishers’ content, journalistic content and fraudulent advertising. It is right to put such duties only on the largest platforms with features enabling the greatest reach, as they have the most significant influence over public discourse online.
I turn first to Amendment 192 in the name of my noble friend Lady Morgan of Cotes and Amendment 192A from the noble Lord, Lord Griffiths of Burry Port, which are designed to widen category 1 definitions to include services that pose a risk of harm, regardless of their number of users. Following removal of the legal but harmful provisions in another place, the Bill no longer includes the concept of risk of harm in category 1 designation. As we have set out, it would not be right for the Government to define what legal content they consider harmful to adults, and it follows that it would not be appropriate for the Government to categorise providers and to require them to carry out duties based on such a definition.
In addition, requiring all companies to comply with the full range of category 1 duties would place a disproportionate burden on services which do not exert the same influence over public discourse online. I appreciate the point made by the noble Baroness, Lady Bull, with regard to regulatory burden. There is a practical element to this as well. Services, particularly smaller ones, have finite resources. Imposing additional duties on them would divert them from complying with their illegal content and child safety duties, which address the most serious online harms. We do not want to weaken their ability to tackle criminal activity or to protect children.
As we discussed in detail in a previous debate, the Bill tackles suicide and self-harm content in a number of ways. The most robust protections in the Bill are for children, while those for adults strike a balance between adults being protected from illegal content and given more choice over what legal content they see. The noble Lord, Lord Stevenson, asked why we do not start with the highest risk rather than thinking about the largest services, but we do. We start with the most severe harms—illegal activity and harm to children. We are focusing on the topics of greatest risk and then, for other categories, allowing adults to make decisions about the content with which they interact online.
A number of noble Lords referred to suicide websites and fora. We are concerned about the widespread availability of content online which promotes and advertises methods of suicide and self-harm, which can be easily accessed by young or vulnerable people. Under the Bill, where suicide and self-harm websites host user-generated content, they will be in scope of the legislation. These sites will need proactively to prevent users from being exposed to priority illegal content, including content which encourages or assists suicide under the terms of the Suicide Act 1961. Additionally, it is an offence under Section 4(3) of the Misuse of Drugs Act 1971 for a website to offer to sell controlled drugs to consumers in England and Wales. Posting advice on how to obtain such drugs in England and Wales is also likely to be an offence, regardless of where the person providing the advice is located.
The Bill also limits the availability of such content by placing duties on search services in relation to illegal content and content which is harmful to children, including where this content is shared on user-to-user services. This will play a key role in reducing traffic that directs people to websites which encourage or assist suicide, and will reduce the likelihood of users encountering such content. The noble Baroness, Lady Bull, asked about starvation. Encouraging people to starve themselves or not to take prescribed medication will be covered.
Amendment 194 tabled by the noble Lord, Lord Stevenson of Balmacara, seeks to ensure that Ofcom can designate companies as category 1, 2A or 2B on a provisional basis, when it considers that they are likely to meet the relevant thresholds. This would mean that the relevant duties can be applied to them, pending a full assessment by Ofcom. The Government recognise the concern highlighted by the noble Lord, Lord Allan, about the rapid pace of change in the technology sector and how that can make it challenging to keep the register of the largest and most influential services up to date. I assure noble Lords that the Bill addresses this with a duty which the Government introduced during the Bill’s recommittal in another place. This duty, at Clause 88, requires Ofcom proactively to identify and publish a list of companies which are close to category 1 thresholds. This will reduce any delays in Ofcom adding additional obligations on companies which grow rapidly, or which introduce new high-risk features. It will also ensure that the regime remains agile and adaptable to emerging threats.
Platforms with the largest reach and greatest influence over public discourse will be designated as category 1. The Bill sets out a clear process for determining category 1 providers, based on thresholds relating to these criteria, which will be set by the Secretary of State in secondary legislation. The process has been designed to ensure that it is transparent and evidence-based. We expect the main social media platforms and possibly some others to be designated as category 1 services, but we do not wish to prejudge the process set out above by indicating which specific services are likely to be designated, as I have set out on previous groups.
The amendment would enable Ofcom to place new duties on companies without due process. Under the approach that we take in the Bill, Ofcom can designate companies as belonging to each category based only on an objective assessment of evidence against thresholds approved by Parliament. The Government’s approach also provides greater certainty for companies than the approach proposed in this amendment. We have heard concerns in previous debates about when companies will have the certainty of knowing their category designation. These amendments would introduce continuous uncertainty and subjectivity into the designation process and would give Ofcom significant discretion over which companies should be subject to which duties. That would create a very uncertain operating environment for businesses and could reduce the attractiveness of the UK as a place to do business.
I hope that explains why we are not taken by these amendments but, in the spirit of the Whitsun Recess, I will certainly think about them on the train as I head north. I am very happy to discuss them with noble Lords and others between now and our return.
Before the Minister sits down, he did let slip that he was going on the sleeper, so I do not think that there will be much thinking going on—although I did not sleep a wink the last time I went, so I am sure that he will have plenty of time.
I am sure that the noble Baroness, Lady Morgan, will want to come in, but could the Minister repeat that? Risk assessment drives us, but a company that will not be regarded as a category 1 provider, because it does not meet the categorisation thresholds, will not be subject to the requirements to pick up the particular issues raised by the noble Baroness and the noble Lord, even though it may be higher risk than some of the category 1 companies. Their concerns about those issues, which are clearly social harms, will not really be considered on a par.
In the response I gave, I said that we are making the risk assessment that the riskiest material is illegal content and content which presents harm to children. That is the assessment and the approach taken in the Bill. In relation to other content, which is legal and which adults may choose how they encounter, there are protections in the Bill to enforce terms of service and to empower users to curate their own experience online, but that assessment is made by adult users within the law.
I thank all noble Lords who spoke in this short but important debate. As we heard, some issues relating to risk and harm have been returned to, and no doubt will be again, and we note the impact of the absence of “legal but harmful” as a concept. As the noble Baroness, Lady Bull, said, the noble Baroness, Lady Parminter, was very sad that she could not be here this afternoon due to another engagement.
I will not keep the House much longer. I particularly noted the noble Baroness’s point that there should not be, and is not, a direct relationship between the size of a platform and its ability to cause harm. There is a balance to be struck between the regulatory burden placed on platforms and the health and well-being of those who use them. As I have said before, I am not sure that we have always got that particular balance right in the Bill.
The noble Lord, Lord Allan, was very constructive: it has to be a good thing if we are now beginning to think about the Bill’s implementation and how it would work in practice, although we have not quite reached the end and I do not want to prejudge any further stages. Of course, he is right to say that some of these platforms have no intention of complying with these rules at all. Ofcom and the Government will have to work out what to do about that.
Ultimately, the Government of the day, whoever that might be, will want the powers to be able to say that a small platform is deeply harmful in terms of its content and reach. When the Bill has been passed, there will be pressure at some point in the future over a platform that is broadcasting, distributing or amplifying deeply harmful content. Although I will withdraw the amendment today, my noble friend’s offer of further conversations, with more detail on categorisation and on any review of the platforms categorised as category 1, 2 and beyond, would be very helpful in due course. I beg leave to withdraw.
Amendment 192 withdrawn.
Amendments 192A and 193 not moved.
Schedule 11 agreed.
Clause 86 agreed.
Amendment 194 not moved.
Clauses 87 and 88 agreed.
Clause 89: OFCOM’s register of risks, and risk profiles, of Part 3 services
Amendments 194A to 197 not moved.
Clause 89 agreed.
Clause 90 agreed.
House resumed.