Committee (7th Day)
Relevant document: 28th Report from the Delegated Powers Committee
56: After Clause 17, insert the following new Clause—
“OFCOM reviews of complaints systems
(1) Within the period of one year beginning on the day on which this Act is passed, and annually thereafter, OFCOM must review the workings of the complaints systems set up by regulated companies under section 17 (duties about complaints procedures), as to—
(a) their effectiveness;
(b) their cost and efficiency; and
(c) such other matters as seem appropriate.
(2) In undertaking the reviews under subsection (1), OFCOM may take evidence from such bodies and individuals as it considers appropriate.
(3) If OFCOM determines from the nature of the complaints being addressed, and the volumes of such complaints, that systems established under section 17 are not functioning as intended, it may establish an online safety ombudsman with the features outlined in subsections (4) to (8), with the costs of this service being met from the levy on regulated companies.
(4) The purpose of the online safety ombudsman is to provide an impartial out-of-court procedure for the resolution of any dispute between—
(a) a user of a regulated user-to-user service, or a nominated representative for that user, and
(b) the regulated service provider,
in cases where complaints made under processes which are compliant with section 17 have not, in the view of the user (or their representative), been adequately addressed.
(5) The ombudsman must allow for a user (or their representative) who is a party to such a dispute to refer their case to the ombudsman if they are of the view that any feature or conduct of one or more provider of a regulated user-to-user service, which is relevant to that dispute, presents (or has presented) a material risk of—
(a) significant or potential harm;
(b) contravening a user’s rights, as set out in the Human Rights Act 1998, including freedom of expression; or
(c) failure to uphold terms of service.
(6) The ombudsman may make special provision for children, including (but not limited to) prioritisation of—
(a) relevant provisions under the United Nations Convention on the Rights of the Child; or
(b) a child’s physical, emotional or psychological state.
(7) The ombudsman must have regard to the desirability of any dispute resolution service provided by the ombudsman being—
(a) free;
(b) easy to use, including (where relevant) taking into account the needs of vulnerable users and children;
(c) effective and timely;
(d) fair and flexible, taking into account different forms of technology and the unique needs of different types of user; and
(e) transparent.
(8) The Secretary of State must ensure that use of any dispute resolution service provided by the ombudsman does not affect the ability of a user (or their representative) to bring a claim in civil proceedings.”
Member’s explanatory statement
This new Clause would require Ofcom to conduct regular reviews of the effectiveness of complaints procedures under Clause 17. If Ofcom were of the view that such procedures were not functioning effectively, they would be able to establish an online safety ombudsman with the features outlined in subsections (4) to (8) of the Clause.
My Lords, Amendment 56 proposes a pathway towards setting up an independent ombudsman for the social media space. It is in my name, and I am grateful to the noble Lord, Lord Clement-Jones, for his support. For reasons I will go into, my amendment is a rather transparent and blatant attempt to bridge a gap with the Government, who have a sceptical position on this issue, and I hope that the amendment in its present form will prove more attractive to them than our original proposal.
At the same time, the noble Baroness, Lady Newlove, has tabled an amendment on this issue, proposing an independent appeals mechanism
“to provide impartial out of court resolutions for individual users of regulated services”.
Given that this is almost exactly what I want to see in place—as was set out in my original amendment, which was subsequently rubbished by the Government—I have also signed the noble Baroness’s amendment, and I very much look forward to her speech. The Government have a choice.
The noble Baroness, Lady Fox, also has amendments in this group, although they are pointing in a slightly different direction. I will not speak to them at this point in the proceedings, although I make it absolutely clear that, while I look forward to hearing her arguments —she is always very persuasive—I support the Bill’s current proposals on super-complaints.
Returning to the question of why we think the Bill should make provision for an independent complaints system or ombudsman, I suppose that, logically, we ought first to hear the noble Baroness, Lady Newlove, then listen to the Government’s response, which presumably will be negative. My compromise amendment could then be considered and, I hope, win the day with support from all around the Committee—in my dreams.
We have heard the Government’s arguments already. As the Minister said in his introduction to the Second Reading debate all those months ago on 1 February 2023, he was unsympathetic. At that time, he said:
“Ombudsman services in other sectors are expensive, often underused and primarily relate to complaints which result in financial compensation. We find it difficult to envisage how an ombudsman service could function in this area, where user complaints are likely to be complex and, in many cases, do not have the impetus of financial compensation behind them”.—[Official Report, 1/2/23; col. 690.]
Talk about getting your retaliation in first.
My proposal is based on the Joint Committee’s unanimous recommendation:
“The role of the Online Safety Ombudsman should be created to consider complaints about actions by higher risk service providers where either moderation or failure to address risks leads to … demonstrable harm (including to freedom of expression) and recourse to other routes of redress have not resulted in a resolution”.
The report goes on to say that there could
“be an option in the Bill to extend the remit of the Ombudsman to lower risk providers. In addition … the Ombudsman would as part of its role i) identify issues in individual companies and make recommendations to improve their complaint handling and ii) identify systemic industry wide issues and make recommendations on regulatory action needed to remedy them. The Ombudsman should have a duty to gather data and information and report it to Ofcom. It should be an ‘eligible entity’ to make super-complaints”
possible. It is a very complicated proposal. Noble Lords will understand from the way the proposal is framed that it would provide a back-up to the primary purpose of complaints, which must be to the individual company and the service it is providing. But it would be based on a way of learning from experience, which it would build up as time went on.
I am sure that the noble Lord, Lord Clement-Jones, will flesh out the Joint Committee’s thinking on this issue when he comes to speak, but I make the point that other countries preparing legislation on online safety are in fact building in independent complaints systems; we are an outlier on this. Australia, Canada and others have already legislated. Another very good example nearer to hand is in Ireland. We are very lucky to have with us today the noble Baroness, Lady Kidron, a member of the expert panel whose advice to the Irish Government to set up such a system in her excellent report in May 2022 has now been implemented. I hope that she will share her thoughts about these amendments later in the debate.
Returning to the Government’s reservations about including an ombudsman service in the Bill, I make the following points based on my proposals in Amendment 56. There need not be any immediate action. The amendment as currently specified requires Ofcom to review complaints systems set up by the companies under Clause 17 as to their effectiveness and efficiency. It asks Ofcom to take other evidence into account and then, and only then, to take the decision of whether to set up an ombudsman system. If there were no evidence of a need for such a service, it would not happen.
As for the other reservations raised by the Minister when he spoke at Second Reading, he said:
“Ombudsman services in other sectors are expensive”.
We agree, but we assume that this would be on a cost recovery model, as other Ofcom services are funded in that way. The primary focus will always be resolving complaints about actions or inactions of particular companies in the companies’ own redress systems, and Ofcom can always keep that under review.
He said that they are “often underused”. Since we do not know at the start what the overall burden will be, we think that the right solution is to build up slowly and let Ofcom decide. There are other reasons why it makes sense to prepare for such a service, and I will come to these in a minute.
He said that other ombudsman services
“primarily relate to complaints which result in financial compensation”.
That is true, but the evidence from other reports, and that we received in the Joint Committee, was that most complainants want non-financial solutions: they want egregious material taken down or to ensure that certain materials are not seen. They are not after the money. Where a company is failing to deliver on those issues in their own complaints system, to deny genuine complainants an appeal to an independent body seems perverse and not in accordance with natural justice.
He said that
“user complaints are likely to be complex”.—[Official Report, 1/2/23; col. 690.]
Yes, they probably are, but that seems to be an argument for an independent appeals body, not against it.
To conclude, we agree that Ofcom should not be the ombudsman and that the right approach is for Ofcom to set up the system as and when it judges that it would be appropriate. We do not want Ofcom to be swamped with complaints from users of regulated services who, for whatever reason, have not been satisfied by the response of the individual companies, or with complex cases or those seeking system-wide solutions. But Ofcom needs to know what is happening on the ground, across the sector, as well as in each of the regulated companies, and it needs to be kept aware of how the system as a whole is performing. The relationship between the FCA and the Financial Ombudsman Service is a good model here. Indeed, the fact that some of the responsibilities to be given to Ofcom in the Bill will give rise to complaints to the FOS suggests that there would be good sense in aligning these services right from the start.
We understand that the experience from Australia is that the existence of an independent complaints function can strengthen the regulatory functions. There is also evidence that the very existence of an independent complaints mechanism can provide reassurances to users that their online safety is being properly supported. I beg to move.
My Lords, this is the first time that I have spoken in Committee. I know we have 10 days, but it seems that we will go even further because this is so important. I will speak to Amendments 250A and 250B.
I thank the noble Lords, Lord Russell of Liverpool and Lord Stevenson of Balmacara, and, of course— if I may be permitted to say so—the amazing noble Baroness, Lady Kidron, who is an absolute whizz on this, for placing their names on these amendments, as well as the 5Rights Foundation, the Internet Watch Foundation and the UK Safer Internet Centre for their excellent briefings. I have spoken to these charities, and the work they do is truly amazing. I do not think that the Bill will recognise just how much time and energy they give to support families and individuals. Put quite simply, we can agree that services’ internal complaint mechanisms are failing.
Let me tell your Lordships about Harry. Harry is an autistic teenager who was filmed by a member of the public in a local fast-food establishment when he was dysregulated and engaging in aggressive behaviour. This footage was shared out of context across social media, with much of the response online labelling Harry as a disruptive teenager who was engaging in unacceptable aggression and vandalising public property. This was shared thousands of times over the course of a few weeks. When Harry and his mum reported it to the social media platforms, they were informed that it did not violate community guidelines and that there was a public interest in the footage remaining online. The family, quite rightly, felt powerless. Harry became overwhelmed at the negative response to the footage and the comments made about his behaviour. He became withdrawn and stopped engaging. He then tried to take his own life.
It was at this point that Harry’s mum reached out to the voluntary-run service Report Harmful Content, as she had nowhere else to turn. Report Harmful Content is run by the charity South West Grid for Learning. It was able to mediate between the social media sites involved to further explain the context and demonstrate the real-world harm that this footage, by remaining online, was having on the family and on Harry’s mental health. Only then did the social media companies concerned remove the content.
Sadly, Harry’s story is not an exception. In 2022, where a platform initially refused to take down content, Report Harmful Content successfully arbitrated the removal of content in 87% of cases, thus demonstrating that even if the content did not violate community guidelines, it was clear that harm had been done. There are countless cases of members of the public reporting a failure to remove content that was bullying them. This culture of inaction has led to apathy and a disbelief among users that their appeals will ever be redressed. Research published by the Children’s Commissioner for England found that 40% of children did not report harmful content because they felt that there
“was no point in doing so”.
The complaints mechanism in the video-sharing platform regulation regime is being repealed without an alternative mechanism to fill the gap. The current video-sharing platform regulation requires platforms to
“provide for an impartial out-of-court procedure for the resolution of any dispute between a person using the service and the provider”
—that is, to operate impartial dispute resolution in the event of a dispute. In its review of the first year of this regulation, Ofcom highlighted that the requirements imposed on platforms in scope are not currently being met in full. However, instead of strengthening existing appeals processes, the VSP regime is set to be repealed and superseded by this Bill.
The Online Safety Bill does not have an individual appeals process, meaning that individuals will be left without an adequate pathway to redress. The Bill establishes only a “super-complaints” process for issues concerning multiple cases or cases highlighting a systemic risk. It will ultimately fall to the third sector to highlight cases to Ofcom on behalf of individuals.
The removal of an appeals process—given the repeal of the VSP regime—would be in stark contrast with the direction of travel in other nations. Independent appeals processes exist in Australia and New Zealand, and more countries are also looking at adopting independent appeals. The new Irish Online Safety and Media Regulation Act includes provision
“for the making of a complaint to the Commission”.
The Digital Services Act in Europe also puts a process in place. There is precedent for these systems. It cannot be right that the Republic of Ireland and the UK and its territories have over 52 ombudsmen in 32 sectors, yet none of them works in digital at a time when online harm—especially to children, as we hear time and again in your Lordships’ House—is at unprecedented levels.
The Government’s response so far has been insufficient. When the Online Safety Bill received its seventh sitting debate, much discussion related to independent appeals, referred to here as the need for an ombudsman. The Digital Minister recognised:
“In some parts of our economy, we have ombudsmen who deal with individual complaints, financial services being an obvious example. The Committee has asked the question, why no ombudsman here? The answer, in essence, is a matter of scale and of how we can best fix the issue. The volume of individual complaints generated about social media platforms is just vast”.
Dame Maria Miller MP said that
“it is not a good argument to say that this is such an enormous problem that we cannot have a process in place to deal with it”.—[Official Report, Commons, Online Safety Bill Committee, 9/6/22; col. 295-96.]
In response to the Joint Committee’s recommendation for an ombudsman, the Government said:
“An independent resolution mechanism such as an Ombudsman is relatively untested in areas of non-financial harm. Therefore, it is difficult to know how an Ombudsman service could function where user complaints are likely to be complex and where financial compensation is not usually appropriate. An Ombudsman service may also disincentivise services from taking responsibility for their users’ safety. Introducing an independent resolution mechanism at the same time as the new regime may also pose a disproportionate regulatory burden for services and confuse users … The Secretary of State will be able to reconsider whether independent resolution mechanisms are appropriate at the statutory review. Users will also already have a right of action in court if their content is removed by a service provider in breach of the terms and conditions. We will be requiring services to specifically state this right of action clearly in their terms and conditions”.
Delaying the implementation of an individual appeals process will simply increase the backlog of cases and will allow for the ripple effect of harm to go unreported, unaddressed and unaccounted for.
There is precedent for individual complaints systems, as I have mentioned, both in the UK and abroad. Frankly, the idea that an individual complaints process will disincentivise companies from taking responsibility does not hold weight, given that these companies’ current appeal mechanisms are woefully inadequate. Users must not be left to the courts to have their appeals addressed. This process is cost-prohibitive for most and cannot be the only pathway to justice for victims, especially children.
To conclude, I have always personally vowed to speak up for those who endure horrific suffering and injustices from tormentors. I know how it feels to suffer the pain and trauma that comes when systems that have been set up are no longer fit for purpose. I therefore say this to my noble friend the Minister: nothing is too difficult if you really want to find a solution. The public have asked for this measure and there is certainly wide precedent for it. By not allowing individuals an appeals process, the Government’s silence simply encourages the tormentors and leaves the tormented alone.
My Lords, I will speak in support of Amendments 250A and 250B; I am not in favour of Amendment 56, which is the compromise amendment. I thank the noble Baroness, Lady Newlove, for setting out the reasons for her amendments in such a graphic form. I declare an interest as a member of the Expert Group on an Individual Complaints Mechanism for the Government of Ireland.
The day a child or parent in the UK has a problem with an online service and realises that they have nowhere to turn is the day that the online safety regime will be judged to have failed in the eyes of the public. Independent redress is a key plank of any regulatory system. Ombudsmen and independent complaint systems are available across all sectors, from finance and health to utilities and beyond. As the noble Lord, Lord Stevenson, set out, they are part of all the tech regulation that has been, or is in the process of being, introduced around the world.
I apologise in advance if the Minister is minded to agree to the amendment, but given that, so far, the Government have conceded to a single word in a full six days in Committee, I dare to anticipate that that is not the case and suggest three things that he may say against the amendment: first, that any complaints system will be overwhelmed; secondly, that it will offer a get-out clause for companies from putting their own robust systems in place; and, thirdly, that it will be too expensive.
The expert group of which I was a member looked very carefully at each of these questions and, after taking evidence from all around the globe, it concluded that the system need not be overwhelmed if it had the power to set clear priorities. In the case of Ireland, those priorities were complaints that might result in real-world violence and complaints from or on behalf of children. The expert group also determined that the individual complaints system should be
“afforded the discretion to handle and conclude complaints in the manner it deems most appropriate and is not unduly compelled toward or statutorily proscribed to certain courses of action in the Bill”.
For example, there was a lot of discussion on whether it could decide not to deal with copycat letters, treat multiple complaints on the same or similar issue as one, and so on.
Also, from evidence submitted during our deliberations, it became clear that many complainants have little idea of the law and that many complaints should be referred to other authorities, so among the accepted recommendations was that the individual complaints system should be
“provided with a robust legal basis for transferring or copying complaints to other bodies as part of the triage process”—
for example, to the data regulator, police, social services and other public bodies. The expert group concluded that this would actually result in better enforcement and compliance in the ecosystem overall.
On the point that the individual complaints mechanism may have the unintended consequence of making regulated services lazy, the expert group—which, incidentally, comprised a broad group of specialisms such as ombudsmen, regulators and legal counsel among others—concluded that it was important for the regulator to set a stringent report and redress code of practice for regulated companies so that it was not possible for any company to just sit back until people were so fed up that they went to the complaints body. The expert group specifically said in its report that it
“is acutely aware of the risk of … the Media Commission … drawing criticism for the failings of the regulated entities to adequately comply with systemic rules. In this regard, an individual complaints mechanism should not be viewed as a replacement for the online platforms’ complaint handling processes”.
Indeed, the group felt that an individual complaints system complemented the powers given to the regulator, which could and should take enforcement against those companies that persistently fail to introduce an adequate complaints system—not least because the flow of complaints would act as an early warning system of emerging harms, which is of course one of the regulator’s duties under the Bill.
When replying to a question from the noble Lord, Lord Knight of Weymouth, last week about funding digital literacy, the Minister made it clear that the online safety regime would be self-financing via the levy. In which case, it does not seem to be out of proportion to have a focused and lean system in which the urgent, the vulnerable and the poorly served have somewhere to turn.
The expert group’s recommendation was accepted in full by Ireland’s Minister for Media, Culture and Tourism, Catherine Martin, who said she would
“always take the side of the most vulnerable”
and the complaint system would deal with people who had
“exhausted the complaints handling procedures by any online services”.
I have had the pleasure of talking to its new leadership in recent weeks, and it is expected to be open for business in 2024.
I set that out at length just to prove that it is possible. It was one of the strong recommendations of the pre-legislative committee, and had considerable support in the other place, as we have heard. I think both Ofcom and DSIT should be aware that many media outlets have not yet clocked that this complicated Bill is so insular that the users of tech have no place to go and no voice.
While the Bill can be pushed through without a complaints system, this leaves it vulnerable. It takes only one incident or a sudden copycat rush of horrors, which have been ignored or trivialised by the sector with complainants finding themselves with nowhere to go but the press, to undermine confidence in the whole regulatory edifice.
So I have three questions for the Minister. The first two are on the VSP regime which, as was set out by the noble Baroness, Lady Newlove, is being cancelled by the Bill. First, could the Minister confirm to the Committee that the VSP complaints system has done nothing useful since it was put in place? Therefore, was the decision to repeal it based on its redundancy?
Secondly, if the system has indeed been deemed redundant, is that because of a failure of capacity or implementation by Ofcom—this is crucial for the Committee to understand, as Ofcom is about to take on the huge burden of this Bill—or is it because all the companies within the regime are now entirely compliant?
Thirdly, once a child or parent has exhausted a company’s complaints system, where, under the Bill in front of us, do the Government think they should go?
I have not yet heard from the noble Baroness, Lady Fox, on her amendments, so I reserve the right to violently agree with her later, but I simply do not understand her reasoning for scrapping super-complaints from the Bill. Over the last six days in Committee, the noble Baroness has repeatedly argued that your Lordships must be wary about putting too much power in the hands of the Government or the tech sector, yet here we have a mechanism that allows users a route to justice that does not depend on their individual wealth. Only those with deep pockets and the skin of a rhinoceros can turn to the law as individuals. A super-complaints system allows interested parties, whether from the kids sector or the Free Speech Union, to act on behalf of a group. As I hope I have made clear, this is additional to, not instead of, an individual complaints system, and I very much hope to have the noble Baroness’s support for both.
My Lords, I also put my name to Amendments 250A and 250B, but the noble Baronesses, Lady Newlove and Lady Kidron, have done such a good job that I shall be very brief.
To understand the position that I suspect the Government may put forward, I suggest one looks at Commons Hansard and the discussion of this in the seventh Committee sitting of 9 June last year. To read it is to descend into an Alice in Wonderland vortex of government contradictions. The then Digital Minister—a certain Chris Philp, who, having been so effective as Digital Minister, was promoted, poor fellow, to become a Minister in the Home Office; I do not know what he did to deserve that—in essence said that, on the one hand, this problem is absolutely vast, and we all recognise that. When responding to the various Members of the Committee who put forward the case for an independent appeals mechanism, he said that the reason we cannot have one is that the problem is too big. So we recognise that the problem is very big, but we cannot actually do anything about it, because it is too big.
I got really worried because he later did something that I would advise no Minister in the current Government ever to do in public. Basically, he said that this
“groundbreaking and world-leading legislation”—[Official Report, Commons, Online Safety Bill Committee, 9/6/22; col. 296.]
will fix this. If I ruled the world, if any Minister in the current Government said anything like that, they would immediately lose the Whip. The track record of people standing up and proudly boasting how wonderful everything is going to be, compared to the evidence of what actually happens, is not a track record of which to be particularly proud.
I witnessed, as I am sure others did, the experience of the noble Baroness, Lady Kidron, pulling together a group of bereaved parents: families who had lost their child through events brought about by the online world. A point that has stayed with me from that discussion was the noble Baroness, Lady Kidron, who was not complaining, saying at the end that there is something desperately wrong with the system where she ends up as the point person to try to help these people resolve their enormous difficulties with these huge companies. I remind noble Lords that the family of Molly Russell, aided by a very effective lawyer, took no less than five years to get Meta to actually come up with what she was looking at online. So the most effective complaints process, or ombudsman, was the fact they were able to have a very able lawyer and an exceptionally able advocate in the shape of the noble Baroness, Lady Kidron, helping in any way she could. That is completely inadequate.
I looked at one of the platforms that currently helps individual users—parents—trying to resolve some of the complaints they have with companies. It is incredibly complicated. So relying on the platforms themselves to bring forward, under the terms of the Bill, fully detailed systems and processes to ensure that these things do not happen, or that if there is a complaint it will be followed up dutifully and quickly, does not exactly fill me with confidence, based on their previous form.
For example, as a parent or an individual, here are some of the questions you might have to ask yourself. How do I report violence or domestic abuse online? How do I deal with eating disorder content on social media? How do I know what is graphic content that does not breach terms? How do I deal with abuse online? What do I do as a UK citizen if I live outside the UK? It is a hideously complex world out there. On the one hand, bringing in regulations to ensure that the platforms do what they are meant to, and on the other hand charging Ofcom to act as the policeman to make sure that they are actually doing it, is heaping yet more responsibility on Ofcom. The noble Lord, Lord Grade, is showing enormous stamina sitting up in the corner; he is sitting where the noble Lord, Lord Young, usually sits, which is a good way of giving the Committee a good impression.
What I would argue to the Minister is that to charge Ofcom with doing too much leads us into dangerous territory. The benefit of having a proper ombudsman who deals with these sorts of complaints week in, week out, is exactly the same argument as if one was going to have a hip or a knee replacement. Would you rather have it done by a surgical team that does it once a year or one that does it several hundred times a year? I do not know about noble Lords, but I would prefer the latter. If we had an effective ombudsman service that dealt with these platforms day in, day out, they would be the most effective individuals to identify whether or not those companies were actually doing what they are meant to do in the law, because they would be dealing with them day in, day out, and would see how they were responding. They could then liaise with Ofcom in real time to tell it if some platforms were not performing as they should. I feel that that would be more effective.
The only question I have for the Minister is whether he would please agree to meet with us between now and Report to really go into this in more detail, because this is an open goal which the Government really should be doing something to try to block. It is a bit of a no-brainer.
My Lords, I see my amendments as being probing. I am very keen on having a robust complaints system, including for individuals, and am open to the argument about an ombudsman. I am listening very carefully to the way that that has been framed. I tabled these amendments because while I know we need a robust complaints system—and I think that Ofcom might have a role in that—I would want that complaints system to be simple and as straightforward as possible. We certainly need somewhere that you can complain.
Ofcom will arguably be the most powerful regulator in the UK, effectively in charge of policing a key element of democracy: the online public square. Of course, one question one might ask is: how do you complain about Ofcom in the middle of it all? Ironically, an ombudsman might be somewhere where you would have to field more than just complaints about the tech companies.
I have suggested completely removing Clauses 150 to 152 from the Bill because of my reservations, beyond this Bill and in general, about a super-complaints system within the regulatory framework, which could be very unhelpful. I might be wrong, and I am open to correction if I have misunderstood, but the Bill’s notion of an eligible entity who will be allowed to make this complaint to Ofcom seems, at the moment, to be appointed on criteria set only by the Secretary of State. That is a serious problem. There is a danger that the Secretary of State could be accused of partiality or politicisation. We therefore have to think carefully about that.
I also object to the idea that certain organisations are anointed with extra legitimacy as super-complaints bodies. We have seen this more broadly. You will often hear Ministers say, in relation to consultations, “We’ve consulted stakeholders and civil society organisations”, when they are actually often referring to lobbying organisations with interests. There is a free-for-all for NGOs and interest groups. We think of a lot of charities as very positive but they are not necessarily neutral. I just wanted to query that.
There is also a danger that organisations will end up speaking on behalf of all women, all children or all Muslims. That is something we need to be careful about in a time of identity politics. We have seen it happen offline with self-appointed community leaders. Say, for example, a super-complainant demands the removal of a particular piece of content that is considered to be harmful, such as an image of the Prophet Muhammad. We have to admit that when people then claim, “We speak on behalf of”, they will cause problems.
Although charities historically have had huge credibility, as I said, we know from some of the scandals that have affected charities recently that they are not always the saviours. They are certainly not immune from corruption, political bias, political disputes and so on.
I suppose my biggest concern is that the function in the Bill is not open to all members of the public. That seems to be a problem. Therefore, we are saying that certain groups and individuals will have a greater degree of influence over the permissibility of speech than others. There are some people who have understood these clauses to mean that it would act like a class action—that if enough people are complaining, it must be a problem. But, as noble Lords will know from their inboxes, sometimes one is inundated with emails and it does not necessarily show a righteous cause. I do not know about anyone else who has been involved in this Bill, but I have had exactly the same cut-and-paste email about violence against women and girls hundreds of times. That usually means a well-organised, sometimes well-funded, mass mobilisation. I have no objection, but just because you get lots of emails it does not mean that it is a good complaint. If you get only one important email complaint that is written by an individual, surely you should respect that minority view.
Is it not interesting that the assumption of speakers so far has been that the complaints will always be that harms have not been removed or taken notice of? I was grateful when the noble Baroness, Lady Kidron, mentioned the Free Speech Union and recognised, as I envisage, that many of the complaints will be about content having been removed—they will be free speech complaints. Often, in that instance, it will be an individual whose content has been removed. I cannot see how the phrasing of the Bill helps us in that. Although I am a great supporter of the Free Speech Union, I do not want it to represent or act on behalf of, say, Index on Censorship or even an individual who simply thinks that their content should not be removed—and who is no less valid than an official organisation, however much I admire it.
I certainly do not want individual voices to be marginalised, which I fear the Bill presently does in relation to complaints. I am not sure about an ombudsman; I am always wary of creating yet another more powerful body in the land because of the danger of over-bureaucratisation.
My Lords, I had to miss a few sessions of the Committee but I am now back until the end. I remind fellow Members of my interests: I worked for one of the largest platforms for a decade, but I have no current interests. It is all in the register if people care to look. I want to contribute to this debate on the basis of that experience of having worked inside the platforms.
I start by agreeing with the noble Baroness, Lady Kidron, the noble Lord, Lord Stevenson, and my noble friend Lord Clement-Jones. The thrust of their amendments—the idea that something will be needed here—is entirely correct. We have created in the Online Safety Bill a mechanism that we in this Committee know is intended primarily to focus on systems and how Ofcom regulates them, but what the public out there hear is that we are creating a mechanism that will meet their concerns—and their concerns will not end with systems. As the noble Baroness, Lady Newlove, eloquently described, their concerns in some instances will be about specific cases and the question will be: who will take those up?
If there is no other mechanism and no way to signpost people to a place where they can seek redress, they will come to Ofcom. That is something we do not want. We want Ofcom to be focused on the big-ticket items of dealing with systems, not bogged down in dealing with thousands of individual complaints. So we can anticipate a situation in which we will need someone to be able to deal with those individual complaints.
I want to focus on making that workable, because the volume challenge might not be as people expect. I have seen from having worked on the inside that there is a vast funnel of reports, where people report content to platforms. Most of those reports are spurious or vexatious; that is the reality. Platforms have made their reporting systems easy, as we want them to do —indeed, in the Bill we say, “Make sure you have really easy-to-use reporting systems”—but one feature of that is that people will use them simply to express a view. Over the last couple of weeks, all the platforms will have been inundated with literally millions of reports about Turkish politicians. These will come from the supporters of either side, reporting people on the other side—claiming that they are engaged in hate speech or pornography or whatever. They will use whatever tool they can. That is what we used to see day in, day out: football teams or political groups that report each other. The challenge is to separate out the signal—the genuinely serious reports of where something is going wrong—from the vast amount of noise, of people simply using the reporting system because they can. For the ombudsman, the challenge will be that signal question.
Breaking that down, from the vast funnel of complaints coming in, we have a smaller subset that are actionable. Some of those will be substantive, real complaints, where the individual simply disagrees with the decision. That could be primarily for two reasons. The first is that the platform has made a bad decision and failed to enforce its own policies. For example, you reported something as being pornographic, and it obviously was, but the operator was having a bad day—they were tired, it was late in the day and they pressed “Leave up” instead of “Take down”. That happens on a regular basis, and 1% of errors like that across a huge volume means a lot of mistakes being made. Those kinds of issues, where there is a simple operator error, should get picked up by the platforms’ own appeal mechanisms. That is what they are there for, and the Bill rightly points to that. A second reviewer should look at it. Hopefully they are a bit fresher, understand that a mistake was made and can simply reverse it. Those operator error reports can be dealt with internally.
The second type would be where the platform enforces policies correctly but, from the complainant’s point of view, the policies are wrong. It may be a more pro-free speech platform where the person says, “This is hate speech”, but the platform says, “Well, according to our rules, it is not. Under our terms of service, we permit robust speech of this kind. Another platform might not, but we do”. In that case, the complainant is still unhappy but the platform has done nothing wrong—unless the policies the platform is enforcing are out of step with the requirements under the Online Safety Bill, in which case the complaint should properly come to Ofcom. Based on the individual complaint, a complainant may have something material for Ofcom. They are saying that they believe the platform’s policies and systems are not in line with the guidance issued by Ofcom—whether on hate speech, pornography or anything else. That second category of complaint would come to Ofcom.
The third class concerns the kind of complaint that the noble Baroness, Lady Newlove, described. In some ways, this is the hardest. The platform has correctly enforced its policies but, in a particular case, the effect is deeply unfair, problematic and harmful for an individual. The platform simply says, “Look, we enforced the policies. They are there. This piece of content did not violate them”. Any outsider looking at it would say, “There is an injustice here. We can clearly see that an individual is being harmed. A similar piece of content might not be harmful to another individual, but to this individual it is”. In those circumstances, groups such as the South West Grid for Learning, with which I work frequently, perform an invaluable task. We should recognise that there is a network of non-governmental organisations in the United Kingdom that do this day in, day out. Groups such as the Internet Watch Foundation and many others have fantastic relations and connections with the platforms and regularly bring exceptional cases to them.
The South West Grid for Learning has let me know of such cases as well. In a way, the amendment seeks to formalise what is already an informal mechanism. I was minded initially to support Amendment 56 in the name of my noble friend Lord Clement-Jones and the noble Lord, Lord Stevenson.
This landscape is quite varied. We have to create some kind of outlet, as the noble Baroness, Lady Kidron, rightly said. That parent or individual will want to go somewhere, so we have to send them somewhere. We want that somewhere to be effective, not to get bogged down in spurious and vexatious complaints. We want it to have a high signal-to-noise ratio—to pull out the important complaints and get them to the platforms. That will vary from platform to platform. In some ways, we want to empower Ofcom to look at what is and is not working and to be able to say, “Platform A has built up an incredible set of mechanisms. It’s doing a good job. We’re not seeing things falling through the cracks in the same way as we are seeing with platform B. We are going to have to be more directive with platform B”. That very much depends on the information coming in and on how well the platforms are doing their job already.
I hope that the Government are thinking about how these individual complaints will be dealt with and about the demand that will be created by the Bill. How can we have effective mechanisms for people in the United Kingdom who genuinely have hard cases and have tried, but where there is no intermediary for the platform they are worried about? In many cases, I suspect that these will be newer or smaller platforms that have arrived on the scene and do not have established relationships. Where are these people to go? Who will help them, particularly in cases where the platform may not systemically be doing anything wrong? Its policies are correct and it is enforcing them correctly, but any jury of peers would say that an injustice is being done. Either an exception needs to be made or there needs to be a second look at that specific case. We are not asking Ofcom to do this in the rest of the legislation.
My Lords, it is always somewhat intimidating to follow the noble Lord, Lord Allan, though it is wonderful to have him back from his travels. I too will speak in favour of Amendments 250A and 250B in the name of my noble friend, from not direct experience in the social media world but tangentially, from telecoms regulation.
I have lived, as the chief executive of a business, in a world where my customers could complain to me but also to an ombudsman and to Ofcom. I say this with some hesitation, as my dear old friends at TalkTalk will be horrified to hear me quoting this example, but 13 years ago, when I took over as chief executive, TalkTalk accounted for more complaints to Ofcom than pretty much all the other telcos put altogether. We were not trying to be bad—quite the opposite, actually. We were a business born out of very rapid growth, both organic and acquisitive, and we did not have control of our business at the time. We had an internal complaints process and were trying our hardest to listen to it and to individual customers who were telling us that we were letting them down, but we were not doing that very well.
While my noble friend has spoken so eloquently about the importance of complaints mechanisms for individual citizens, I am actually in favour of them for companies. I felt the consequences of having an independent complaints system that made my business listen. It was a genuine failsafe system. For someone to have got as far as complaining to the telecoms ombudsman and to Ofcom, they had really lost the will to live with my own business. That forced my company to change. It has forced telecoms companies to change so much that they now advertise where they stand in the rankings of complaints per thousand customers. Even in the course of the last week, Sky was proclaiming in its print advertising that it was the least complained-about to the independent complaints mechanism.
So this is not about thinking that companies are bad and are trying to let their customers down. As the noble Lord, Lord Allan, has described, managing these processes is really hard and you really need the third line of defence of an independent complaints mechanism to help you deliver on your best intentions. I think most companies with very large customer bases are trying to meet those customers’ needs.
For very practical reasons, I have experienced the power of these sorts of systems. There is one difference with the example I have given of telecoms: it was Ofcom itself that received most of those complaints about TalkTalk 13 years ago, and I have tremendous sympathy with the idea that we might unleash on poor Ofcom all the social media complaints that are not currently being resolved by the companies. That is exactly why, as Dame Maria Miller said, we need to set up an independent ombudsman to deal with this issue.
From a very different perspective from that of my noble friend, I struggle to understand why the Government do not want to do what they have just announced they want to do in other sectors such as gambling.
My Lords, I had better start by declaring an interest. It is a great pleasure to follow the noble Baroness, Lady Harding, because my interest is directly related to the ombudsman she has just been praising. I am chairman of the board of the Trust Alliance Group, which runs the Energy Ombudsman and the telecoms ombudsman. The former was set up under the Consumers, Estate Agents and Redress Act 2007 and the latter under the Communications Act 2003.
Having got that off my chest, I do not have to boast about the efficacy of ombudsmen; they are an important institution, they take the load off the regulator to a considerable degree and they work closely with the participating companies in the schemes they run. On balance, I would prefer the Consumers, Estate Agents and Redress Act scheme because it involves a single ombudsman, but both those ombudsmen demonstrate the benefit in their sectors.
The noble Lord, Lord Stevenson, pretty much expressed the surprise that we felt when we read the Government’s response to what we thought was a pretty sensible suggestion in the Joint Committee’s report. He quoted it, and I am going to quote it again because it is such an extraordinary statement:
“An independent resolution mechanism such as an Ombudsman is relatively untested in areas of non-financial harm”.
If you look at the ones for which I happen to have some responsibility, and at the other ombudsmen—there is a whole list we could go through: the Legal Ombudsman, the Local Government and Social Care Ombudsman, the Parliamentary and Health Service Ombudsman—there are a number who are absolutely able to take a view on non-financial matters. It is a bit flabbergasting, if that is a parliamentary expression, to come across that kind of statement in a government response.
There has been distilled wisdom during the course of this debate. Although there may be differences of view about whether we have half a loaf or a full loaf, what is clear is that we are all trying to head in the same direction, which is to have an ombudsman for complaints in this sector. We need to keep reminding everybody that this is not a direct complaints system: it is a secondary complaints system, and you have to have exhausted the complaints within the social media platform. My noble friend described some of the complexity of that extremely well, and I thank the noble Baroness, Lady Kidron, for setting out some of the complexities and the views of the expert group.
I mentioned the Government’s response to the Joint Committee, but as the noble Baroness, Lady Newlove, said, we already have an independent appeals system in the video-sharing platform legislation. Why are we going backwards in this Bill? We should be being more comprehensive as a result of this. This Bill is set to dismantle an essential obligation that supports victims of online harm on video-sharing platforms. We have to be more comprehensive, not less. The South West Grid for Learning’s independent appeals process, which has been mentioned already today, highlights that a significant number of responses received by victims of harmful content from industry platforms were initially incorrect, and its Report Harmful Content service was able to resolve them.
Some of these misunderstandings are not necessarily complaints that need adjudication; sometimes it is actually miscommunication. We have heard during the debate that other countries are already doing this; several noble Lords have mentioned Ireland, Australia and New Zealand. We need that ability also, and this is where I disagree with the noble Baroness, Lady Fox, although there was a nuance in her argument. It was a probing amendment—how about that? The fact that representative organisations can defend users’ rights for large-scale breaches of the law is very important, and I was rather surprised by her criticism of the fact that lobby groups can bring action. Orchestrating complaints is the nature of group or class actions in litigation. Those complaints need to be tested, and it is perfectly legitimate for a group to bring them.
The noble Baroness, Lady Newlove, mentioned the research published by the Children’s Commissioner which showed that 40% of children did not report harmful content because they felt there was no point in doing so. That is pretty damning of the current situation. The noble Lord, Lord Russell, and my noble friend made the very strong point that we do not want to bog down the regulator. We see what happens under the data protection legislation. The ICO has an enormous number of complaints to deal with directly, without the benefit of an ombudsman. This scheme could alleviate the burden on the regulator and be highly effective. I do not think we have heard an argument in Committee against this; it must be the way forward. I very much hope that the Minister will take this forward after today and install an ombudsman for the Bill.
My Lords, the amendments in this group are concerned with complaints mechanisms. I turn first to Amendment 56 from the noble Lord, Lord Stevenson of Balmacara, which proposes introducing a requirement on Ofcom to produce an annual review of the effectiveness and efficiency of platforms’ complaints procedures. Were this review to find that regulated services were not complying effectively with their complaints procedure duties, the proposed new clause would provide for Ofcom to establish an ombudsman to provide a dispute resolution service in relation to complaints.
While I am of course sympathetic to the aims of this amendment, the Government remain confident that service providers are best placed to respond to individual user complaints, as they will be able to take appropriate action promptly. This could include removing content, sanctioning offending users, reversing wrongful content removal or changing their systems and processes. Accordingly, the Bill imposes a duty on regulated user-to-user and search services to establish and operate an easy-to-use, accessible and transparent complaints procedure. The complaints procedure must provide for appropriate action to be taken by the provider in relation to the complaint.
It is worth reminding ourselves that this duty is an enforceable requirement. Where a provider is failing to comply with its complaints procedure duties, Ofcom will be able to take enforcement action against the regulated service. Ofcom has a range of enforcement powers, including the power to impose significant penalties and confirmation decisions that can require the provider to take such steps as are required for compliance. In addition, the Bill includes strong super-complaints provisions that will allow for concerns about systemic issues to be raised with the regulator, which will be required to publish its response to the complaint. This process will help to ensure that Ofcom is made aware of issues that users are facing.
Separately, individuals will also be able to submit complaints to Ofcom. Given the likelihood of an overwhelming volume of complaints, as we have heard, Ofcom will not be able to investigate or arbitrate on individual cases. However, those complaints will be an essential part of Ofcom’s horizon-scanning, research, supervision and enforcement activity. They will guide Ofcom in deciding where to focus its attention. Ofcom will also have a statutory duty to conduct consumer research about users’ experiences in relation to regulated services and the handling of complaints made by users to providers of those services. Further, Ofcom can require that category 1, 2A and 2B providers set out in their annual transparency reports the measures taken to comply with their duties in relation to complaints. This will further ensure that Ofcom is aware of any issues facing users in relation to complaints processes.
At the same time, I share the desire expressed to ensure that the complaints mechanisms will be reviewed and assessed. That is why the Bill contains provisions for the Secretary of State to undertake a review of the efficacy of the entire regulatory framework. This will take place between two and five years after the Part 3 provisions come into force, which is a more appropriate interval for the efficacy of the duties around complaints procedures to be reviewed, as it will allow time for the regime to bed in and provide a sufficient evidence base to assess whether changes are needed.
Finally, I note that Amendment 56 assumes that the preferred solution following a review will be an ombudsman. There is probably not enough evidence to suggest that an ombudsman service would be effective for the online safety regime. It is unclear how an ombudsman service would function in support of the new online safety regime, because individual user complaints are likely to be complex and time-sensitive—and indeed, in many cases financial compensation would not be appropriate. So I fear that the noble Lord’s proposed new clause pre-empts the findings of a review with a solution that is resource-intensive and may be unsuitable for this sector.
Amendments 250A and 250B, tabled by my noble friend Lady Newlove, require that an independent appeals system is established and that Ofcom produces guidance to support this system. As I have set out, the Government believe that decisions on user redress and complaints are best dealt with by services. Regulated services will be required to operate an easy-to-use, accessible and transparent complaints procedure that enables users to make complaints. If services do not comply with these duties, Ofcom will be able to utilise its extensive enforcement powers to bring them into compliance.
The Government are not opposed to revisiting the approach to complaints once the regime is up and running. Indeed, the Bill provides for the review of the regulatory framework. However, it is important that the new approach, which will radically change the regulatory landscape by proactively requiring services to have effective systems and processes for complaints, has time to bed in before it is reassessed.
Turning specifically to the points made by my noble friend and by the noble Baroness, Lady Kidron, about the impartial out-of-court dispute resolution procedure in the VSP regime, the VSP regime and the Online Safety Bill are not directly comparable. The underlying principles of both regimes are of course the same, with the focus on systems regulation and protections for users, especially children. The key difference is the online safety framework’s increased scope. The Bill covers a wider range of harms and introduces online safety duties on a wider range of platforms. Under the online safety regime, Ofcom will also have a more extensive suite of enforcement powers than under the UK’s VSP regime.
On user redress, the Bill goes further than the VSP regime as it will require services to offer an extensive and effective complaints process and will enable Ofcom to take stronger enforcement action where they fail to meet this requirement. That is why the Government have put the onus of the complaints procedure on the provider and set out a more robust approach which requires all in-scope, regulated user-to-user and search services to offer an effective complaints process that provides for appropriate action to be taken in relation to the complaint. This will be an enforceable duty and will enable Ofcom to utilise its extensive online safety enforcement powers where services are not complying with their statutory duty to provide a usable, accessible and transparent complaints procedure.
At the same time, we want to ensure that the regime can develop and respond to new challenges. That is why we have included a power for the Secretary of State to review the regulatory framework once it is up and running. This will provide the correct mechanism to assess whether complaint handling mechanisms can be further strengthened once the new regulations have had time to bed in.
For these reasons, the Government are confident that the Online Safety Bill represents a significant step forward in keeping users safe online.
My Lords, could I just ask a question? This Bill has been in gestation for about five to six years, during which time the scale of the problems we are talking about has increased exponentially. The Government appear to be suggesting that they will, in three to five years, evaluate whether or not their approach is working effectively.
There was a lot of discussion in this Chamber yesterday about the will of the people and whether the Government were ignoring it. I gently suggest that the very large number of people, who are having all sorts of problems or who are fearful of harm from the online world, will not find in the timescale that the Government are proposing the sort of remedy and speed of action I suspect they were hoping for. Certainly, the rhetoric the Government have used and continue to use at regular points in the Bill when they are slightly on the back foot seems to be designed to try to make the situation seem better than it is.
Will the Minister and the Bill team take on board that there are some very serious concerns that there will be a lot of lashing back at His Majesty’s Government if in three years’ time—which I fear may be the case—we still have a situation where a large body of complaints are not being dealt with? Ofcom is going to suffer from major ombudsman-like constipation trying to deal with this, and the harms will continue. I think I speak for the Committee when I say that the arguments the Minister and the government side are making really do not hold water.
I thought in particular of the direct experience of the noble Baroness, Lady Harding, demonstrating the effect on her company—so substitute platforms for that—of knowing that you are being held to account. Having a system that helps the regulator understand in real time whether or not these companies are doing what they should—they are an early warning system and would know earlier than Ofcom would—just seems sensible. But perhaps being sensible is not what this Bill is about.
I do not know about that last point. I was going to say that I am very happy to meet the noble Lord to discuss it. It seems to me to come down to the timing of the first review. As I say, I am delighted to meet the noble Lord. By the way, the relevant shortest period is two years, not three, as he said.
Following on from my friend, the noble Lord, Lord Russell, can I just say to the Minister that I would really welcome all of us having a meeting? As I am listening to this, I am thinking that three to five years is just horrific for the families. This Bill has gone on for so long to get where we are today. We are losing sight of humanity here and the moral compass of protecting human lives. For whichever Government is in place in three to five years to make the decision to say it does not work is absolutely shameful. Nobody in the Government will be accountable and yet for that family, that single person may commit suicide. We have met the bereaved families, so I say to the Minister that we need to go round the table and look at this again. I do not think it is acceptable to say that there is this timeline, this review, for the Secretary of State when we are dealing with young lives. It is in the public interest to get this Bill correct as it navigates its way back to the House of Commons in a far better state than how it arrived.
I would love the noble Viscount to answer my very specific question about who the Government think families should turn to when they have exhausted the complaints system in the next three to five years. I say that as someone who has witnessed successive Secretaries of State promising families that this Bill would sort this out. Yes?
It is between two and five years. It can be two; it can be five. I am very happy to meet my noble friend and to carry on doing so. The complaints procedure set up for families is first to approach the service provider, whose duties are enforceable; should the provider fail to meet those duties, families can then revert to Ofcom and, beyond that, the courts.
I am sorry but that is exactly the issue at stake. The understanding of the Committee currently is that there is then nowhere to go if they have exhausted that process. I believe that complainants are not entitled to go to Ofcom in the way that the noble Viscount just suggested.
I am happy to meet and discuss this. We are expanding what they are able to receive today under the existing arrangements. I am happy to meet any noble Lords who wish to take this forward to help them understand this—that is probably best.
Amendments 287 and 289 from the noble Baroness, Lady Fox of Buckley, seek to remove the provision for super-complaints from the Bill. The super-complaints mechanism is an important part of the Bill’s overall redress mechanisms. It will enable entities to raise concerns with Ofcom about systemic issues in relation to regulated services, which Ofcom will be required to respond to. This includes concerns about the features of services or the conduct of providers creating a risk of significant harm to users or the public, as well as concerns about significant adverse impacts on the right to freedom of expression.
On who can make super-complaints, any organisation that meets the eligibility criteria set out in secondary legislation will be able to submit a super-complaint to Ofcom. Organisations will be required to submit evidence to Ofcom, setting out how they meet these criteria. Using this evidence, Ofcom will assess organisations against the criteria to ensure that they meet them. The assessment of evidence will be fair and objective, and the criteria will be intentionally strict to ensure that super-complaints focus on systemic issues and that the regulator is not overwhelmed by the number it receives.
To clarify and link up the two parts of this discussion, can the Minister perhaps reflect, when the meeting is being organised, on the fact that the organisations and the basis on which they can complain will be decided by secondary legislation? So we do not know which organisations or what the remit is, and we cannot assess how effective that will be. We know that the super-complainants will not want to overwhelm Ofcom, so things will be bundled into that. Individuals could be excluded from the super-complaints system in the way that I indicated, because super-complaints will not represent everyone, or even minority views; in other words, there is a gap here now. I want that bit gone, but that does not mean that we do not need a robust complaints system. Before Report at least—in the meetings in between—the Government need to advise on how you complain if something goes wrong. At the moment, the British public have no way to complain at all, unless someone sneaks it through in secondary legislation. This is not helpful.
Again, I am just pulling this together—I am curious to understand this. We have been given a specific case—South West Grid for Learning raising a case based on an individual but that had more generic concerns—so could the noble Viscount clarify, now or in writing, whether that is the kind of thing that he imagines would constitute a super-complaint? If South West Grid for Learning went to a platform with a complaint like that—one based on an individual but brought by an organisation—would Ofcom find that complaint admissible under its super-complaints procedure, as imagined in the Bill?
Overall, the super-complaints mechanism is more for groupings of complaints and has a broader range than the individual complaints process, but I will consider that point going forward.
Many UK regulators have successful super-complaints mechanisms which allow them to identify and target emerging issues and effectively utilise resources. Alongside the Bill’s research functions, super-complaints will perform a vital role in ensuring that Ofcom is aware of the issues users are facing, helping them to target resources and to take action against systemic failings.
On the steps required after super-complaints, the regulator will be required to respond publicly to the super-complaint. Issues raised in the super-complaint may lead Ofcom to take steps to mitigate the issues raised in the complaint, where the issues raised can be addressed via the Bill’s duties and powers. In this way, they perform a vital role in Ofcom’s horizon-scanning powers, ensuring that it is aware of issues as they emerge. However, super-complaints are not linked to any specific enforcement process.
My Lords, it has just occurred to me what the answer is to the question, “Where does an individual actually get redress?” The only way they can get redress is by collaborating with another 100 people and raising a super-complaint. Is that the answer under the Bill?
This is getting worse and worse. I am tempted to suggest that we stop talking about this and try to, in a smaller group, bottom out what we are doing. I really think that the Committee deserves a better response on super-complaints than it has just heard.
As I understood it—I am sure that the noble Baroness, Lady Kidron, is about to make the same point—super-complaints are specifically designed to take away the pressure on vulnerable and younger persons to have responsibility only for themselves in bringing forward the complaint that needs to be resolved. They are a way of sharing that responsibility and taking away the pressure. Is the Minister now saying that that is a misunderstanding?
I understand that the Minister has been given a sticky wicket of defending the indefensible. I welcome a meeting, as I think the whole Committee does, but it would be very helpful to hear the Government say that they have chosen to give individuals no recourse under the Bill—that this is the current situation, as it stands, and that there is no concession on the matter. I have been in meetings with people who have been promised such things, so it is really important, from now on in Committee, that we actually state at the Dispatch Box what the situation is. I spent quite a lot of the weekend reading circular arguments, and we now need to get to an understanding of what the situation is. We can then decide, as a Committee, what we do in relation to that.
As I said, I am very happy to hold the meeting. We are giving users greater protection through the Bill, and, as agreed, we can discuss individual routes to recourse.
I hope that, on the basis of what I have said and the future meeting, noble Lords have some reassurance that the Bill’s complaint mechanisms will, eventually, be effective and proportionate, and feel able not to press their amendments.
I am very sorry that I did not realise that the Minister was responding to this group of amendments; I should have welcomed him to his first appearance in Committee. I hope he will come back—although he may have to spend a bit of time in hospital, having received a pass to speak on this issue from his noble friend.
This is a very complicated Bill. The Minister and I have actually talked about that over tea, and he is now learning the hard lessons of what he took as a light badinage before coming to the Chamber today. However, we are in a bit of a mess here. I was genuinely trying to get an amendment that would encourage the department to move forward on this issue, because it is quite clear from the mood around the Committee that something needs to be resolved here. The way the Government are approaching this is by heading towards a brick wall, and I do not think it is the right way forward.
The Minister cannot ignore the evidence, from two very well-respected practitioners who have been involved in this sort of process and understand how it works, that this is not the way forward. He has heard somebody who works professionally in this area explain how the system works in practice. He is hearing from individuals who, as we have now discovered, otherwise have nowhere to go. We are being told what seems to be a very confusing story about what the super-complaints system is about and how it will be done. This must be sorted, otherwise he will find that the Great British public, for whom the Bill is designed, particularly younger people, will turn around and say, “This is what you promised us?” They will not believe it and they will not like it.
All I heard coming through from the debate is that Ofcom will pick up a lot of complaints and use them to inform itself about what should happen five years down the track, the next time that the regulatory review takes place. That is not what we are about here. This is about filling a gap in a system for which promises have been issued over the seven long years that we have been waiting for the Bill. People out there expect the Bill to make their lives much more reasonable and to be respectful of their rights and responsibilities.
We find that the VSP provisions are being deleted—a system we already have, which at least does first approximation work. We find that we are reinforcing the inequality of arms between individuals and companies. We find that DCMS—it is not the same department because the Minister is now in DSIT, but it is his former sister department—is creating an ombudsman for gambling problems, having identified that they have gone too far, too fast, and are now out of control and need to be responded to. This just does not add up. The Government are in a mess. Please sort it. I beg leave to withdraw the amendment.
Amendment 56 withdrawn.
Clause 18: Duties about freedom of expression and privacy
Amendment 57 not moved.
58: Clause 18, page 20, line 32, at end insert “as defined under the Human Rights Act 1998 and its application to the United Kingdom.”
My Lords, I am delighted to propose this group of amendments on devolution issues. I am always delighted to see the Committee so full to talk about devolution issues. I will speak particularly to Amendments 58, 136, 225A and 228 in this group, all in my name. I am very grateful to the noble Lord, Lord Foulkes of Cumnock, for supporting them.
As I have said before in Committee, I have looked at the entire Bill from the perspective of a devolved nation, in particular at the discrepancies and overlaps of Scots law, UK law and ECHR jurisprudence that I was concerned had not been taken into account or addressed by the Bill as it stands. Many have said that they are not lawyers; I am also not. I am therefore very grateful to the Law Society of Scotland, members of Ofcom’s Advisory Committee for Scotland, and other organisations such as the Carnegie Trust and Legal to Say, Legal to Type, which have helped formulate my thinking. I also thank the Minister and the Bill team for their willingness to discuss these issues in advance with me.
When the first proposed Marshalled List for this Committee was sent round, my amendments were dotted all over the place. When I explained to the Whips that they were all connected to devolved issues and asked that they be grouped together, that must have prompted the Bill team to go and look again; the next thing I know, there is a whole raft of government amendments in this group referring to Wales, Northern Ireland, the Bailiwick of Guernsey and the Isle of Man—though not Scotland, I noted. These government amendments are very welcome; if nothing else, I am grateful to have achieved that second look from the devolved perspective.
In the previous group, we heard how long the Bill had been in gestation. I have the impression that, because online safety decision-making is a centralised and reserved matter, the regions are overlooked and engaged only at a late stage. The original internet safety Green Paper made no reference to Scotland at all; it included a section on education describing only the English education system and an annexe of legislation that did not include Scottish legislation. Thankfully, this oversight was recognised by the White Paper, two years later, which included a section on territorial scope. Following this, the draft Bill included a need for platforms to recognise the differences in legislation across the UK, but this was subsequently dropped.
I remain concerned that the particular unintended consequences of the Bill for the devolved Administrations have not been fully appreciated or explored. While online safety is a reserved issue, many of the matters that it deals with—such as justice, the police or education—are devolved, and, as many in this House appreciate, Scots law is different.
At the moment, the Bill is relatively quiet on how freedom of expression is defined; how it applies to the providers of user-to-user services and their duties to protect users’ rights to freedom of expression; and how platforms balance those competing rights when adjudicating on content removal. My Amendment 58 has similarities to Amendment 63 in the name of the noble and learned Lord, Lord Hope of Craighead. It seeks to ensure that phrases such as “freedom of expression” are understood in the same way across the United Kingdom. As the noble and learned Lord pointed out when speaking to his Amendment 63 in a previous group, words matter, and I will therefore be careful to refer to “freedom of expression” rather than “freedom of speech” throughout my remarks.
Amendment 58 asks the Government to state explicitly which standards of speech platforms apply in each of the jurisdictions of the UK, because at this moment there is a difference. I accept that the Human Rights Act is a UK statute already, but, under Article 10—as we have heard—freedom of expression is not an absolute right and may be subject to such formalities, conditions, restrictions or penalties as are prescribed by law.
The noble Lord, Lord Moylan, argued last week that the balance between freedom of expression and any condition or restriction was not an equal one but was weighted in favour of freedom of expression. I take this opportunity to take some issue with my noble friend, who is not in his place, on this. According to the Equality and Human Rights Commission, the British Institute of Human Rights and Supreme Court judgments, human rights are equal and indivisible, neither have automatic priority, and how they are balanced depends on the context and the particular facts.
In Scotland, the Scottish Government believe that they are protecting freedom of expression, but the Hate Crime and Public Order (Scotland) Act 2021 criminalises speech that is not illegal elsewhere in the UK. Examples from the Scottish Government’s own information note state that it is now an offence in Scotland
“if the urging of people to cease practising their religion is done in a threatening or abusive manner or, alternatively, … if a person were to urge people not to engage in same-sex sexual activity while making abusive comments about people who identify as lesbian, gay or bisexual”.
The Lord Advocate’s guidance to the police says that
“an incident must be investigated as a hate crime if it is perceived, by the victim or any other person, to be aggravated by prejudice”.
I stress that I make absolutely no comment about the merits, or otherwise, of the Hate Crime and Public Order (Scotland) Act. I accept that it is yet to be commenced. However, commencement is in the hands of the Scottish Parliament, not the Minister and his team, and I highlight it here as an illustration of the divergence of interpretation that is happening between the devolved nations now, and as an example of what could happen in the future.
So, I would have thought that we would want to take a belt-and-braces approach to ensuring that there cannot be any differences in interpretation of what we mean by freedom of expression, and I hope that the Minister will accept my amendment for the sake of clarity. Ofcom is looking for clarity wherever possible, and clarity will be essential for platforms. Amendment 58 would allow platforms to interpret freedom of expression as a legal principle, rather than having to adapt considerations for Scotland, and it would also help prevent Scottish users’ content being censored more than that of English users, as platforms could rely on a legally certain basis for decision-making.
The hate crime Act was also the motivation for my Amendment 136, which asks why the Government did not include it on the list of priority offences in Schedule 7. I understand that the Scottish Government did not ask for it to be included, but since when did His Majesty’s Government do what the Scottish Government ask of them?
I have assumed that the Scottish Government did not ask for it because the hate crime Act is yet to be commenced in Scotland and there are, I suspect, multiple issues to be worked out with Police Scotland and others before it can be. I stress again that it is not my intention that the Hate Crime and Public Order (Scotland) Act should dictate the threshold for illegal and priority illegal content in this Bill—Amendment 136 is a probing amendment—but the omission of the hate crime Act does raise the question of a devolution deficit because, while the definition of “illegal content” varies, people in areas of the UK with more sensitive thresholds would have to rely on the police to enforce some national laws online rather than benefiting from the additional protections of the Ofcom regime.
Clause 53(5)(c) of this Bill states that
“the offence is created by this Act or, before or after this Act is passed, by”—
this is in sub-paragraph (iv)—
“devolved subordinate legislation made by a devolved authority with the consent of the Secretary of State or other Minister of the Crown”.
How would this consent be granted? How would it involve this Parliament? What consultation should be required, and with whom—particularly since the devolved offence might change the thresholds for the offence across the whole of the UK? The phrase “consent of the Secretary of State” implies that a devolved authority would apply to seek consent. Should not this application process be set out in the Bill? What should the consultation process with devolved authorities and Ofcom be if the Secretary of State wishes to initiate the inclusion of devolved subordinate legislation? Do we not need a formal framework for parliamentary scrutiny—an equivalent of the Grimstone process, perhaps? I would be very happy to work with the Minister and his team on a Parkinson process between now and Report.
Amendments 225A and 228 seek to ensure that there is an analysis of users’ online experiences in the different nations of the UK. Amendment 225A would require Ofcom to ensure that its research into online experiences was analysed in a nation-specific way while Amendment 228 would require Ofcom’s transparency reporting to be reported via each nation. The fact is that, at this moment in time, we do not know whether there is a difference in the online experience across the four nations. For example, are rural or remote communities at greater risk of online harm because they have a greater dependence on online services? How would online platforms respond to harmful sectarian content? What role do communication technologies play in relation to offline violence, such as knife crime?
We can compare other data by nation, for example on drug use or gambling addiction. Research and transparency reporting are key to understanding nation-specific harms online, but I fear that Ofcom will have limited powers in this area if they are not specified in the Bill. Ofcom has good working relationships from the centre with the regions, and part of this stems from the fact that legislation in other sectors, such as broadcasting, requires it to have advisory committees in each of the nations to ensure that English, Scottish, Northern Irish and Welsh matters are considered properly. Notably, those measures do not exist in this Bill.
The interplay between the high-level and reserved nature of internet services and online safety will require Ofcom to develop a range of new, wider partnerships in Scotland—for example with Police Scotland—and to collaborate closely at a working level with a wide range of interests within the Scottish Government, where such interests will be split across a range of ministerial portfolios. In other areas of its regulatory responsibility, Ofcom’s research publications provide a breakdown of data by nation. Given the legislative differences that already exist between the four nations, it is an omission that such a breakdown is not explicitly required in the Bill.
I have not touched—and I am not going to touch—on how this Bill might affect other devolved Administrations. The noble Baroness, Lady Foster of Aghadrumsee, apologises for being unable to be in the Chamber to lend her voice from a Northern Ireland perspective—I understand from her that the Justice (Sexual Offences and Trafficking Victims) Act (Northern Ireland) 2022 might be another example of this issue—but she has indicated her support here. As my noble friend Lady Morgan of Cotes said last Thursday:
“The Minister has done a very good job … batting away amendments”.—[Official Report, 11/5/23; col. 2043.]
However, I am in an optimistic mood this afternoon, because the Minister responded quite positively to the request from the noble and learned Lord, Lord Hope, that we should define “freedom of expression”. There is great benefit to be had from ensuring that this transparency of reporting and research can be broken down by nation. I am hopeful, therefore, that the Minister will take the points that I have raised through these amendments and that he will, as my noble friend Lady Morgan of Cotes hoped, respond by saying that he sees my points and will work with me to ensure that this legislation works as we all wish it to across the whole of the UK. I beg to move.
My Lords, I warmly support the amendment moved by the noble Baroness, Lady Fraser of Craigmaddie, to which I have added my name. I agree with every word she said in her introduction. I could not have said it better and I have nothing to add.
My Lords, I follow the noble Lord, Lord Foulkes, with just a few words. As we have been reminded, I tabled Amendment 63, which has already been debated. The Minister will remember that my point was about legal certainty; I was not concerned with devolution, although I mentioned Amendment 58 just to remind him that we are dealing with all parts of the United Kingdom in the Bill and it is important that the expression should have the same meaning throughout all parts.
We are faced with the interesting situation which arose in the strikes Bill: the subject matter of the Bill is reserved, but one must have regard to the fact that its effects spread into devolved areas, which have their own systems of justice, health and education. That is why there is great force in the point that the noble Baroness, Lady Fraser, has been making. I join the noble Lord, Lord Foulkes, in endorsing what she said without going back into the detail, but remind the Minister that devolution exists, even though we are dealing with reserved matters.
My Lords, this is unfamiliar territory for me, but the comprehensive introduction of the noble Baroness, Lady Fraser, has clarified the issue. I am only disappointed that we had such a short speech from the noble Lord, Lord Foulkes—uncharacteristic, perhaps I could say—but it was good to hear from the noble and learned Lord, Lord Hope, on this subject as well. The noble Baroness’s phrase “devolution deficit” is very useful shorthand for some of these issues. She has raised a number of questions about the Secretary of State’s powers under Clause 53(5)(c): the process, the method of consultation and whether there is a role for Ofcom’s national advisory committees. Greater transparency in order to understand which offences overlap in all this would be very useful. She deliberately did not go for one solution or another, but issues clearly arise where the thresholds are different. It would be good to hear how the Government are going to resolve this issue.
My Lords, it is a pity that we have not had the benefit of hearing from the Minister, because a lot of his amendments in this group seem to bear on some of the more generic points made in the very good speech by the noble Baroness, Lady Fraser. I assume he will cover them, but I wonder whether he would at least be prepared to answer any questions people might come back with—not in any aggressive sense; we are not trying to scare the pants off him before he starts. For example, the points made by the noble Lord, Lord Clement-Jones, intrigue me.
I used to have responsibility for devolved issues when I worked at No. 10 for a short period. It was a bit of a joke, really. Whenever anything Welsh happened, I was immediately summoned down to Cardiff and hauled over the coals. You knew when you were in trouble when they all stopped speaking English and started speaking Welsh; then, you knew there really was an issue, whereas before I just had to listen, go back and report. In Scotland, nobody came to me anyway, because they knew that the then Prime Minister was a much more interesting person to talk to about these things. They just went to him instead, so I did not really learn very much.
I noticed some issues in the Marshalled List that I had not picked up on when I worked on this before. I do not know whether the Minister wishes to address this—I do not want to delay the Committee too much—but are we saying that to apply a provision in the Bill to the Bailiwick of Guernsey or the Isle of Man, an Order in Council is required to bypass Parliament? Is that a common way of proceeding in these places? I suspect that the noble and learned Lord, Lord Hope, knows much more about this than I do—he shakes his head—but this is a new one on me. Does it mean that this Parliament has no responsibility for how its laws are applied in those territories, or are there other procedures of which we are unaware?
My second point again picks up what the noble Lord, Lord Clement-Jones, was saying. Could the Minister go through in some detail the process by which a devolved authority would apply to the Secretary of State—presumably for DSIT—to seek consent for a devolved offence to be included in the Online Safety Bill regime? If this is correct, who grants what to whom? Does this come to the House as a statutory instrument? Is just the Secretary of State involved, or does it go to the Privy Council? Are there other ways that we are yet to know about? It would be interesting to know.
To echo the noble Lord, Lord Clement-Jones, we probably do need a letter from the Minister, if he ever gets this cleared, setting out exactly how the variation in powers would operate across the four territories. If there are variations, we would like to know about them.
My Lords, I am very grateful to my noble friend Lady Fraser of Craigmaddie for her vigilance in this area and for the discussion she had with the Bill team, which they and I found useful. Given the tenor of this short but important debate, I think it may be helpful if we have a meeting for other noble Lords who also want to benefit from discussing some of these things in detail, and particularly to talk about some of the issues the noble Lord, Lord Stevenson of Balmacara, just raised. It would be useful for us to talk in detail about general questions on the operation of the law before we look at this again on Report.
In a moment, I will say a bit about the government amendments which stand in my name. I am sure that noble Lords will not be shy in taking the opportunity to interject if questions arise, as they have not been shy on previous groups.
I will start with the amendments tabled by my noble friend Lady Fraser. Her Amendment 58 seeks to add reference to the Human Rights Act 1998 to Clause 18. That Act places obligations on public authorities to act compatibly with the European Convention on Human Rights. It does not place obligations on private individuals and companies, so it would not make sense for such a duty on internet services to refer to the Human Rights Act.
Under that Act, Ofcom has obligations to act in accordance with the right to freedom of expression under Article 10 of the European Convention on Human Rights. As a result, the codes that Ofcom draws up will need to comply with the Article 10 right to freedom of expression. Schedule 4 to the Bill requires Ofcom to ensure that measures which it describes in a code of practice are designed in light of the importance of protecting the right of users’
“freedom of expression within the law”.
Clauses 44(2) and (3) provide that platforms will be treated as complying with their freedom of expression duty if they take the recommended measures that Ofcom sets out in the codes. Platforms will therefore be guided by Ofcom in taking measures to comply with its duties, including safeguards for freedom of expression through codes of practice.
My noble friend’s Amendment 136 seeks to add offences under the Hate Crime and Public Order (Scotland) Act 2021 to Schedule 7. Public order offences are already listed in Schedule 7 to the Bill, which will apply across the whole United Kingdom. This means that all services in scope will need proactively to tackle content that amounts to an offence under the Public Order Act 1986, regardless of where the content originates or where in the UK it can be accessed.
The priority offences list has been developed with the devolved Administrations, and Clause 194 outlines the parliamentary procedures for updating it. The requirements for consent will be set out in the specific subordinate legislation that may apply to the particular offence being made by the devolved authorities—that is to say, they will be laid down by the enabling statutes that Parliament will have approved.
Amendment 228 seeks to require the inclusion of separate analyses of users’ online experiences in England, Wales, Scotland and Northern Ireland in Ofcom’s transparency reports. These transparency reports are based on the information requested from category 1, 2A and 2B service providers through transparency reporting. I assure my noble friend that Ofcom is already able to request country-specific information from providers in its transparency reports. The legislation sets out high-level categories of information that category 1, 2A and 2B services may be required to include in their transparency reports. The regulator will set out in a notice the information to be requested from the provider, the format of that information and the manner in which it should be published. If appropriate, Ofcom may request specific information in relation to each country in the UK, such as the number of users encountering illegal content and the incidence of such content.
Ofcom is also required to undertake consultation before producing guidance about transparency reporting. In order to ensure that the framework is proportionate and future-proofed, however, it is vital to allow the regulator sufficient flexibility to request the types of information that it sees as relevant, and for that information to be presented by providers in a manner that Ofcom has deemed to be appropriate.
Similarly, Amendment 225A would require separate analyses of users’ online experiences in England, Wales, Scotland and Northern Ireland in Ofcom’s research about users’ experiences of regulated services. Clause 141 requires that Ofcom make arrangements to undertake consumer research to ascertain public opinion and the experiences of UK users of regulated services. Ofcom will already be able to undertake this research on a country-specific basis. Indeed, in undertaking its research and reporting duties, as my noble friend alluded to, Ofcom has previously adopted such an approach. For instance, it is required by the Communications Act 2003 to undertake consumer research. While the legislation does not mandate that Ofcom conduct and publish nation-specific research, Ofcom has done so, for instance through its publications Media Nations and Connected Nations. I hope that gives noble Lords some reassurance of its approach in this regard. Ensuring that Ofcom has flexibility in carrying out its research functions will enable us to future-proof the regulatory framework, and will mean that its research activity is efficient, relevant and appropriate.
I will now say a bit about the government amendments standing in my name. I should, in doing so, highlight that I have withdrawn Amendments 304C and 304D, previously in the Marshalled List, which will be replaced with new amendments to ensure that all the communications offences, including the new self-harm offence, have the appropriate territorial extent when they are brought forward. They will be brought forward as soon as possible once the self-harm offence has been tabled.
Amendments 267A, 267B, 267C, 268A, 268B to 268G, 271A to 271D, 304A, 304B and 304E are amendments to Clauses 160, 162, 164 to 166, 168 and 210 and Schedule 14, relating to the extension of the false and threatening communications offences and the associated liability of corporate officers in Clause 166 to Northern Ireland.
This group also includes some technical and consequential amendments to the false and threatening communications offences and technical changes to the Malicious Communications (Northern Ireland) Order 1988 and Section 127 of the Communications Act 2003. This will minimise overlap between these existing laws and the new false and threatening communications offences in this Bill. Importantly, they mirror the approach taken for England and Wales, providing consistency in the criminal law.
This group also contains technical amendments to update the extent of the epilepsy trolling offence to reflect that it applies to England, Wales and Northern Ireland.
Amendment 286B is a technical amendment to repeal a provision in the Digital Economy Act 2017 that will become redundant when Part 3 of that Act is repealed by this Bill.
Amendments 304F and 304G give the Bailiwick of Guernsey and the Isle of Man the power to extend the Online Safety Bill to their jurisdictions, should they wish. Amendments 304A and 304H to 304K have been tabled to reflect the Bailiwick of Jersey opting to forgo a permissive extent clause in this instance.
With the offer of a broader meeting to give other noble Lords the benefit of the discussions with the Bill team that my noble friend has had—I extend that invitation to her, of course, to continue the conversation with us—I hope that provides information about the government amendments in this group and some reassurance on the points that my noble friend has made. I hope that she will be willing to withdraw her amendment and that noble Lords will accept the government amendments.
I suggested that we might see a table, independent of the meetings, although I am sure they could coincide. Would it be possible to have a table of all the criminal offences that the Minister listed and how they apply in each of the territories? Without that, we are a bit at sea as to exactly how they apply.
I put on record that the withdrawal of Part 3 of the Digital Economy Act 2017 will be greeted with happiness only should the full schedule of AV and harms be put into the Bill. I must say that because the noble Baroness, Lady Benjamin, is not in her place. She worked very hard for that piece of legislation.
My Lords, I thank the Minister for his response. I take it as a win that we have been offered a meeting and further discussion, and that the noble Lord, Lord Foulkes, agreed with every word I said. I hope we can continue in this happy vein in my time in this House.
The suggestion from the noble Lord, Lord Stevenson, of a table is a welcome one. Something that has interested me is that some of the offences the Minister mentioned were open goals: there were holes leaving it open in Northern Ireland and not in England and Wales, or whatever. For example, epilepsy trolling is already a criminal offence in Scotland, but I am not sure that was appreciated when we started this discussion.
I look forward to the meeting and I thank the Minister for his response. I am still unconvinced that we have the right consultation process for any devolved authority wanting to apply for a subordinate devolved Administration to be included under this regime.
It concerns me that the Minister talked about leaving it to Ofcom to request the data that it deemed to be appropriate. The feeling on the ground is that Ofcom, which is based in London, may not understand what is or is not necessarily appropriate in the devolved Administrations. The fact that in other legislation—for example, on broadcasting—it is mandated that data is broken down nation by nation is really important. It is even more important because of the interplay between the devolved and the reserved matters. The fact that there is no equivalent Minister in the Scottish Government to talk about digital and online safety matters with means that a whole raft of different people will need to have relationships with Ofcom who have not had them hitherto.
I thank the Minister. On that note, I withdraw my amendment.
Amendment 58 withdrawn.
Amendments 59 to 64 not moved.
Clause 18 agreed.
Clause 19: Record-keeping and review duties
Amendments 64A to 64D
64A: Clause 19, page 21, line 36, leave out “all”
Member’s explanatory statement
This is a technical amendment needed because the new duty to supply records of risk assessments to OFCOM (see the amendment in the Minister’s name inserting new subsection (8A) below) is imposed only on providers of Category 1 services.
64B: Clause 19, page 21, line 36, at end insert “(as indicated by the headings).”
Member’s explanatory statement
This amendment provides clarification because the new duty to supply records of risk assessments to OFCOM (see the amendment in the Minister’s name inserting new subsection (8A) below) is imposed only on providers of Category 1 services.
64C: Clause 19, page 21, line 38, after “of” insert “all aspects of”
Member’s explanatory statement
This amendment concerns a duty imposed on providers to keep records of risk assessments. The added words make it clear that full records must be made.
64D: Clause 19, page 21, line 38, at end insert “, including details about how the assessment was carried out and its findings.”
Member’s explanatory statement
This amendment concerns a duty imposed on providers to keep records of risk assessments. The added words make it clear that full records must be made.
Amendments 64A to 64D agreed.
Amendments 65 and 65ZA not moved.
65A: Clause 19, page 22, line 26, at end insert—
“(8A) As soon as reasonably practicable after making a record of a risk assessment as required by subsection (2), or revising such a record, a duty to supply OFCOM with a copy of the record (in full).”Member’s explanatory statement
This amendment requires providers of Category 1 services to supply copies of their records of risk assessments to OFCOM. The limitation to Category 1 services is achieved by an amendment in the name of the Minister to clause 6.
Amendment 65A agreed.
Amendment 65AA not moved.
Clause 19, as amended, agreed.
Clause 20: Providers of search services: duties of care
Amendments 65B to 65E
65B: Clause 20, page 23, line 5, leave out “and (3)” and insert “to (3A)”
Member’s explanatory statement
This technical amendment is consequential on the other changes to clause 20 (arising from the new duties in clauses 23, 25 and 29 which are imposed on providers of Category 2A services only - see the amendments in the Minister’s name to those clauses below).
65C: Clause 20, page 23, line 10, at end insert “(2) to (8)”
Member’s explanatory statement
This amendment is consequential on the amendments in the Minister’s name to clause 23 below (because the new duty to summarise illegal content risk assessments in a publicly available statement is only imposed on providers of Category 2A services).
65D: Clause 20, page 23, line 15, at end insert “(2) to (6)”
Member’s explanatory statement
This amendment is consequential on the amendments in the Minister’s name to clause 29 below (because the new duty to supply records of risk assessments to OFCOM is only imposed on providers of Category 2A services).
65E: Clause 20, page 23, line 15, at end insert—
“(2A) Additional duties must be complied with by providers of particular kinds of regulated search services, as follows.”Member’s explanatory statement
This technical amendment is consequential on the other changes to clause 20 (arising from the new duties in clauses 23, 25 and 29 which are imposed on providers of Category 2A services only - see the amendments in the Minister’s name to those clauses below).
Amendments 65B to 65E agreed.
Amendment 66 not moved.
Amendments 66A to 66D
66A: Clause 20, page 23, line 16, leave out “In addition,”
Member’s explanatory statement
This technical amendment is consequential on the other changes to clause 20 (arising from the new duties in clauses 23, 25 and 29 which are imposed on providers of Category 2A services only - see the amendments in the Minister’s name to those clauses below).
66B: Clause 20, page 23, line 20, at end insert “(2) to (8)”
Member’s explanatory statement
This amendment is consequential on the amendments in the Minister’s name to clause 25 below (because the new duty to summarise children’s risk assessments in a publicly available statement is only imposed on providers of Category 2A services).
66C: Clause 20, page 23, line 20, at end insert—
“(3A) All providers of regulated search services that are Category 2A services must comply with the following duties in relation to each such service which they provide—(a) the duty about illegal content risk assessments set out in section 23(8A),(b) the duty about children’s risk assessments set out in section 25(8A), and(c) the duty about record-keeping set out in section 29(8A).”Member’s explanatory statement
This amendment ensures that the new duties set out in the amendments in the Minister’s name to clauses 23, 25 and 29 below (duties to summarise risk assessments in a publicly available statement and to supply records of risk assessments to OFCOM) are imposed on providers of Category 2A services only.
66D: Clause 20, page 23, line 21, at end insert—
“(5) For the meaning of “Category 2A service”, see section 86 (register of categories of services).”Member’s explanatory statement
This amendment inserts a signpost to the meaning of “Category 2A service”.
Amendments 66A to 66D agreed.
Clause 20, as amended, agreed.
Clause 21 agreed.
Clause 22: Illegal content risk assessment duties
Amendment 66DA not moved.
66E: Clause 22, page 24, line 38, after “29(2)” insert “and (8A)”
Member’s explanatory statement
This amendment inserts a signpost to the new duty in clause 29 about supplying records of risk assessments to OFCOM.
Amendment 66E agreed.
Clause 22, as amended, agreed.
Clause 23: Safety duties about illegal content
Amendments 66F and 66G
66F: Clause 23, page 24, line 42, leave out “all”
Member’s explanatory statement
This is a technical amendment needed because the new duty to summarise illegal content risk assessments in a publicly available statement (see the amendment in the Minister’s name inserting new subsection (8A) below) is imposed only on providers of Category 2A services.
66G: Clause 23, page 24, line 42, at end insert “(as indicated by the headings).”
Member’s explanatory statement
This amendment provides clarification because the new duty to summarise illegal content risk assessments in a publicly available statement (see the amendment in the Minister’s name inserting new subsection (8A) below) is imposed only on providers of Category 2A services.
Amendments 66F and 66G agreed.
Amendments 67 to 72 not moved.
72A: Clause 23, page 25, line 31, at end insert—
“(8A) A duty to summarise in a publicly available statement the findings of the most recent illegal content risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to individuals).”Member’s explanatory statement
This amendment requires providers of Category 2A services to summarise (in a publicly available statement) the findings of their latest risk assessment regarding illegal content. The limitation to Category 2A services is achieved by an amendment in the name of the Minister to clause 20.
Amendment 72A agreed.
Clause 23, as amended, agreed.
Amendment 73 not moved.
Clause 24: Children’s risk assessment duties
Amendments 74 and 75 not moved.
75A: Clause 24, page 26, line 45, after “29(2)” insert “and (8A)”
Member’s explanatory statement
This amendment inserts a signpost to the new duty in clause 29 about supplying records of risk assessments to OFCOM.
Amendment 75A agreed.
Clause 24, as amended, agreed.
Clause 25: Safety duties protecting children
75B: Clause 25, page 27, line 4, at end insert “(as indicated by the headings).”
Member’s explanatory statement
This amendment provides clarification because the new duty to summarise children’s risk assessments in a publicly available statement (see the amendment in the Minister’s name inserting new subsection (8A) below) is imposed only on providers of Category 2A services.
Amendment 75B agreed.
Amendments 76 to 81 not moved.
81A: Clause 25, page 27, line 46, at end insert—
“(8A) A duty to summarise in a publicly available statement the findings of the most recent children’s risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to children).”Member’s explanatory statement
This amendment requires providers of Category 2A services to summarise (in a publicly available statement) the findings of their latest children’s risk assessment. The limitation to Category 2A services is achieved by an amendment in the name of the Minister to clause 20.
Amendment 81A agreed.
Amendments 82 to 85 not moved.
Clause 25, as amended, agreed.
Clause 26: Duty about content reporting
Amendment 86 not moved.
Clause 26 agreed.
Clause 27: Duties about complaints procedures
Amendment 87 not moved.
Clause 27 agreed.
Clause 28: Duties about freedom of expression and privacy
Amendment 88 not moved.
Clause 28 agreed.
Clause 29: Record-keeping and review duties
Amendments 88A to 88D
88A: Clause 29, page 31, line 4, leave out “all”
Member’s explanatory statement
This is a technical amendment needed because the new duty to supply records of risk assessments to OFCOM (see the amendment in the Minister’s name inserting new subsection (8A) below) is imposed only on providers of Category 2A services.
88B: Clause 29, page 31, line 4, at end insert “(as indicated by the headings).”
Member’s explanatory statement
This amendment provides clarification because the new duty to supply records of risk assessments to OFCOM (see the amendment in the Minister’s name inserting new subsection (8A) below) is imposed only on providers of Category 2A services.
88C: Clause 29, page 31, line 6, after “of” insert “all aspects of”
Member’s explanatory statement
This amendment concerns a duty imposed on providers to keep records of risk assessments. The added words make it clear that full records must be made.
88D: Clause 29, page 31, line 6, at end insert “, including details about how the assessment was carried out and its findings.”
Member’s explanatory statement
This amendment concerns a duty imposed on providers to keep records of risk assessments. The added words make it clear that full records must be made.
Amendments 88A to 88D agreed.
Amendments 89 and 90 not moved.
90A: Clause 29, page 31, line 37, at end insert—
“(8A) As soon as reasonably practicable after making a record of a risk assessment as required by subsection (2), or revising such a record, a duty to supply OFCOM with a copy of the record (in full).”Member’s explanatory statement
This amendment requires providers of Category 2A services to supply copies of their records of risk assessments to OFCOM. The limitation to Category 2A services is achieved by an amendment in the name of the Minister to clause 20.
Amendment 90A agreed.
Amendment 90B not moved.
Clause 29, as amended, agreed.
Amendments 91 and 91A not moved.
Clause 30: Children’s access assessments
Amendment 92 not moved.
Clause 30 agreed.
Clause 31 agreed.
Schedule 3 agreed.
Amendment 93 not moved.
Clause 32 agreed.
Clause 33: Duties about fraudulent advertising: Category 1 services
Amendment 94 not moved.
Clause 33 agreed.
Clause 34: Duties about fraudulent advertising: Category 2A services
Amendment 95 not moved.
Clause 34 agreed.
Clause 35 agreed.
96: After Clause 35, insert the following new Clause—
“Suicide or self-harm content duties
(1) This section sets out the duties about harmful suicide or self-harm content which apply to all regulated user-to-user services and providers of search services.(2) This section applies in respect of all service users.(3) A duty to include provisions in the terms of service specifying the treatment to be applied in relation to harmful suicide or self-harm content.(4) The possible kinds of treatment of content referred to in subsection (3) are—(a) taking down the content;(b) restricting users’ access to the content;(c) limiting the recommendation or promotion of the content.(5) A duty to explain in the terms of service the provider’s response to the risks relating to harmful suicide or self- harm content by reference to—(a) any provisions of the terms of service included in compliance with the duty set out in subsection (3), and(b) any other provisions of the terms of service designed to mitigate or manage those risks.(6) If provisions are included in the terms of service in compliance with the duty set out in subsection (3), a duty to ensure that those provisions—(a) are clear and accessible, and(b) are applied consistently in relation to content which meets the definition in section 207.”Member’s explanatory statement
This creates a duty for providers of regulated user-to-user services and search services to manage harmful suicide or self-harm content, applicable to both children and adults.
I am particularly grateful to the noble Lords who co-signed Amendments 96, 240 and 296 in this group. Amendment 225 is also important and warrants careful consideration, as it explicitly includes eating disorders. These amendments have strong support from Samaritans, which has helped me in drafting them, and from the Mental Health Foundation and the BMA. I declare that I am an elected member of the BMA ethics committee.
We have heard much in Committee about the need to protect children online more effectively even than in the Bill. On Tuesday the noble Baroness, Lady Morgan of Cotes, made a powerful speech acknowledging that vulnerability does not stop at the age of 18 and that the Bill currently creates a cliff edge whereby there is protection from harmful content for those under 18 but not for those over 18. The empowerment tools will be futile for those seriously contemplating suicide and self-harm. No one should underestimate the power of suicide contagion and the addictive nature of the content that is currently pushed out to people, goading them into such actions and drawing them into repeated viewings.
Amendment 96 seeks to redress that. It incorporates a stand-alone provision, creating a duty for providers of user-to-user services to manage harmful content about suicide or self-harm. This provision would operate as a specific category, relevant to all regulated services and applicable to both children and adults. Amendment 296 defines harmful suicide or self-harm content. It is important that we define that to avoid organisations such as Samaritans, which provide suicide prevention support, being inadvertently caught up in clumsy, simplistic search engine categorisation.
Suicide and self-harm content affects people of all ages. Adults in distress search the internet, and children easily bypass age-verification measures and parental controls even when they have been switched on. The Samaritans Lived Experience Panel reported that 82% of people who died by suicide, having visited websites that encouraged suicide and/or methods of self-harm, were over the age of 25.
Samaritans considers that the types of suicide and self-harm content that are legal but unequivocally harmful include, but are not limited to, information, depictions, instructions and advice on methods of self-harm and suicide; content that portrays self-harm and suicide as positive or desirable; and graphic descriptions or depictions of self-harm and suicide. As the Bill stands, platforms will not even need to consider the risk that such content could pose to adults. This will leave all that dangerous online content widely available and undermines the Bill’s intention from the outset.
Last month, other parliamentarians and I met Melanie, whose relative Jo died by suicide in 2020. He was just 23. He had accessed suicide-promoting content online, and his family are speaking out to ensure that the Bill works to avoid future tragedies. A University of Bristol study reported that those with severe suicidal thoughts actively use the internet to research effective methods and often find clear suggestions. Swansea University reported that three quarters of its research participants had harmed themselves more severely after viewing self-harm content online.
Amendment 240 complements the other amendments in this group, although it would not rely on them to be effective. It would establish a specific unit in Ofcom to monitor the prevalence of suicide, self-harm and harmful content online. I should declare that this is in line with the Private Member’s Bill I have introduced. In practice, that means that Ofcom would need to assess the efficacy of the legislation in practice. It would require Ofcom to investigate the content and the algorithms that push such content out to individuals at an alarming rate.
Researchers at the Center for Countering Digital Hate set up new accounts in the USA, UK, Canada and Australia at the minimum age TikTok allows, which is 13. These accounts paused briefly on videos about body image and mental health, and “liked” them. Within 2.6 minutes, TikTok recommended suicide content, and it sent content on eating disorders within eight minutes.
Ofcom’s responsibility for ongoing review and data collection, reported to Parliament, would take a future-facing approach covering new technologies. New communications and internet technologies are being developed at pace in ways we cannot imagine. The term
“in a way equivalent … to”
in Amendment 240 is specifically designed to include the metaverse, where interactions are instantaneous, virtual and able to incite, encourage or provoke serious harm to others.
We increasingly live our lives online. Social media is expanding, while user-to-user sites are now shopping platforms for over 70% of UK consumers. However, online is also being used to sell suicide kits or lethal substances, as recently covered in the press. It is important that someone holds the responsibility for reporting on dangers in the online world. Harmful suicide content methods and encouragement were found through a systematic review to be concentrated on sites with low levels of moderation and easy search functions for images. Some 78% of people with lived experience of suicidality and self-harm surveyed by Samaritans agree that new laws are needed to make online spaces safer.
I urge noble Lords to support my amendments, which aim to ensure that self-harm, suicide and seriously harmful content is addressed across all platforms in all categories as well as search engines, regardless of their functionality or reach, and for all persons, regardless of age. Polling by Samaritans has shown high support for this: four out of five agree that harmful suicide and self-harm content can damage adults as well as children, while three-quarters agree that tech companies should by law prevent such content being shown to users of all ages.
If the Government are not minded to adopt these amendments, can the Minister tell us specifically how the Bill will take a comprehensive approach to placing duties on all platforms to reduce dangerous content promoting suicide and self-harm? Can the Government confirm that smaller sites, such as forums that encourage suicide, will need to remove priority illegal content, whatever the level of detail in their risk assessment? Lastly—I will give the Minister a moment to note my questions—do the Government recognise that we need an amendment on Report to create a new offence of assisting or encouraging suicide and serious self-harm? I beg to move.
My Lords, I particularly support Amendment 96, to which I have added my name; it is a privilege to do so. I also support Amendment 296 and I cannot quite work out why I have not added my name to it, because I wholeheartedly agree with it, but I declare my support now.
I want to talk again about an issue that the noble Baroness, Lady Finlay, set out so well and that we also touched on last week, about the regulation of suicide and self-harm content. We have all heard of the tragic case of Molly Russell, but a name that is often forgotten in this discussion is Frankie Thomas. Frankie was a vulnerable teenager with childhood trauma, functioning autism and impulsivity. After reading a story about self-harm on the app Wattpad, according to the coroner’s inquest, she went home and undertook
“a similar act, resulting in her death”.
I do not need to repeat the many tragic examples that have already been shared in this House, but I want to reiterate the point already made by the BMA in its very helpful briefing on these amendments: viewing self-harm and suicide content online can severely harm the user offline. As I said last week when we were debating the user empowerment tools, this type of content literally has life or death repercussions. It is therefore essential that the Bill takes this sort of content more seriously and creates specific duties for services to adhere to.
We will, at some point this evening—I hope—come on to debate the next group of amendments. The question for Ministers to answer on this group, the next one and others that we will be debating is, where we know that content is harmful to society—to individuals but also to broader society—why the Government do not want to take the step of setting out how that content should be properly regulated. I think it all comes from their desire to draw a distinction between content that is illegal and content that is not illegal but is undoubtedly, in the eyes of pretty well every citizen, deeply harmful. As we have already heard from the noble Baroness, and as we heard last week, adults do not become immune to suicide and self-harm content the minute they turn 18. In fact, I would argue that no adult is immune to the negative effects of viewing this type of content online.
This amendment, therefore, is very important, as it would create a duty for providers of regulated user-to-user services and search engines to manage harmful suicide or self-harm content applicable to both children and adults, recognising this cliff edge otherwise in the Bill, which we have already talked about. I strongly urge noble Lords, particularly the Minister, to agree that protecting users from this content is one of the most important things that the Bill can do. People outside this House are looking to us to do this, so I urge the Government to support this amendment today.
My Lords, I am pleased that we have an opportunity, in this group of amendments, to talk about suicide and self-harm content, given the importance of it. It is important to set out what we expect to happen with this legislation. I rise particularly to support Amendment 225, to which my noble friend Lady Parminter added her name. I am doing this more because the way in which this kind of content is shared is incredibly complex, rather than simply because of the question of whether it is legal or illegal.
Our goal in the regulations should actually be twofold. First, we do, of course, want to reduce the likelihood that somebody who is at lower risk of suicide and self-harm might move into the higher-risk group as a result of their online activity on user-to-user services. That is our baseline goal. We do not want anyone to go from low risk to high risk. Secondly, as well as this “create no new harm” goal, we have a harm reduction goal, which is that people who are already at a higher risk of suicide and self-harm might move into a lower-risk category through the use of online services to access advice and support. It is really important, in this area, that we do not lose sight of the fact there are two aspects to the use of online services. It is no simple task to try to achieve both these goals, as they can sometimes be in tension with each other in respect of particular content types.
There is a rationale for removing all suicide and self-harm content, as that is certainly a way to achieve that first goal. It makes it less likely that a low-risk person would encounter—in the terms of the Bill—or be exposed to potentially harmful content. Some countries certainly do take that approach. They say that anything that looks like it is encouraging suicide or self-harm should be removed, full stop. That is a perfectly rational and legitimate approach.
There is, however, a cost to this approach, which I wish to tease out. It would be helpful in this debate to understand that, and it might not be immediately apparent what that cost is. It is that there are different kinds of individuals posting this content, so if we look at the experience of what happens on online platforms, there certainly is a community of people who post content with the express aim of hurting others: people who we often call trolls, who are small in number but incredibly toxic. They are putting out suicide and self-harm content because they want other people to suffer. They might think it is funny, but whatever they think, they are doing it with an expressly negative intent.
There is also a community of individuals and organisations who believe that they are sharing content to help those who are at risk. This can vary: some can be formal organisations such as the Samaritans, and others can be enterprising individuals, sometimes people who themselves had experiences that they wish to share, who will create online fora and share content. It might be content that looks similar to content that appears harmful, but their expressed goal is seeking to help others online. Most of these fora are for that purpose. Then there are the individuals themselves, who are looking for advice and support relevant to what is happening in their own lives and to connect with others who share their experiences.
We might see the same piece of content very differently when posted by people in these groups. If an individual in that troll group is showing an image of self-harm, that is an aggressive, harmful act; there is no excuse for it, and we want to get rid of it. However, the same content might be part of an educational exchange when posted by an expert organisation. The noble Baroness, Lady Finlay, said that we needed to make sure that this new legislation did not inadvertently sweep up those who were in that educational space.
The hardest group is the group of individuals, where, in many cases, the posting of that content is a cry for help, and an aggressive response by the platform can, sadly, be counterproductive to that individual if they have gone online to seek help. The effect of that is that the content is removed and, because they violated the platform’s terms of service, that person who is feeling lonely and vulnerable might lose social media accounts that are important to them for seeking help. Therefore, by seeking to reduce their exposure to content, we might inadvertently end up creating a scenario in which they lose all that is valuable to them. That is the other inadvertent harm that we want to ensure we avoid in regulating and seeking to have Ofcom issue the most appropriate guidance.
We should be able to advance both goals: removing the content that is posted with harmful intent but enabling content that is there as a cry for help, or as a support and advice service. It is in that context that something like the proposal for an expert group for Ofcom is very helpful. Again, having worked at a platform, I can say that we often reached out to advisers and sought help. Sometimes, the advice was conflicting. Some people would say it was really important that if someone was sharing images of self-harm they should be got rid of; others would say that, in certain contexts, it was really important to allow that person to share the image of self-harm and have a discussion with others—and that maybe the response was to use that as a trigger, to point them towards a support service that they need.
Again, when somebody is at imminent risk of suicide, protocols have been developed to deal with that, because the solution is not something the platform itself can provide. If a platform has detected that somebody is at imminent risk of suicide, it needs to find a way to ensure that either a support body such as the Samaritans or, in many cases, the police are notified so that they can go to that person’s house, knock on the door and prevent the suicide happening. Platforms in some countries have the relationships that they need with local bodies. Making that disclosure is very sensitive; you are disclosing highly sensitive personal data to an outside body, against the individual’s wishes. There will not be consent from them, in many cases, and that has to be worked through.
If we are thinking about protocols for dealing with self-harm content, we will reach some of the same issues. It may be that informing parents, a school or some other body to get help to that individual would be the right thing to do. That is very sensitive in terms of the data disclosure and privacy aspects.
The Bill is an opportunity to improve all of this. There are pieces of very good practice and, clearly, areas where not enough is being done and too much very harmful content—particularly content that is posted with the express intent of causing harm—is being allowed to circulate. I hope that, through the legislation and by getting these protocols right, we can get to the point where we are both preventing lower-risk people moving into a higher-risk category and enabling people already in a high-risk category to get the help, support and advice that they need. Nowadays, online services are often the primary tools that could benefit them.
My Lords, as usual, the noble Lord, Lord Allan of Hallam, has explained with some nuance the trickiness of this area, which at first sight appears obvious—black and white—but is not quite that. I want to explore some of these things.
Many decades ago, I ran a drop-in centre for the recovering mentally ill. I remember my shock the first time that I came across a group of young women who had completely cut themselves up. It was a world that I did not know but, at that time, a very small world—a very minor but serious problem in society. Decades later, going around doing lots of talks, particularly in girls’ schools where I am invited to speak, I suddenly discovered that whole swathes of young women were doing something that had been considered a mental health problem, often hidden away. Suddenly, people were talking about a social contagion of self-harm happening in the school. Similarly, there were discussions about eating disorders being not just an individual mental health problem but something that kind of grew within a group.
Now we have the situation with suicide sites, which are phenomenal at exploiting those vulnerabilities. This is undoubtedly a social problem of some magnitude. I do not in any way want to minimise it, but I am not sure exactly how legislation can resolve it or whether it will, even though I agree that it could do certain things.
Some of the problems that we have, which have already been alluded to, really came home to me when I read about Instagram bringing in some rules on self-harm material, which ended up with the removal of posts by survivors of self-harm discussing their illness. I read a story in the Coventry Evening Telegraph—I was always interested because I ran the drop-in centre for the recovering mentally ill in Coventry—where a young woman had some photographs taken down from Instagram because they contained self-harm images of her scars. The young woman who had suffered these problems, Marie from Tile Hill, wanted to share pictures of her scars with other young people because she felt that they would help others recover. She had got over it and was basically saying that the scars were healing. In other words, it was a kind of self-help group for users online, yet it was taken down.
It is my usual problem: this looks to be a clear-cut case, yet the complexities can lead to problems of censorship of a sort. I was really pleased that the noble Baroness, Lady Finlay, stressed the point about definitions. Search engines such as Google have certainly raised the concern that people looking for help—or even looking to write an essay on suicide, assisted suicide or whatever—will end up not being able to find appropriate material.
I also want to ask a slightly different question. Who decides which self-harms are in our definitions and what is the contagion? When I visit schools now, there is a new social contagion in town, I am afraid to say, which is that of gender dysphoria. In the polling for the newly published report Show, Tell and Leave Nothing to the Imagination by Jo-Anne Nadler, half of the young people interviewed said that they knew someone at their school who wanted to change gender or had already done so, while one in 10 said that they wanted to change their gender.
That is just an observation; your Lordships might ask what it has to do with the Bill. But these are actually problem areas that are being affirmed by educational organisations and charities, by which I mean organisations that have often worked with government and been consulted as stakeholders. They have recommended to young women where to purchase online chest binders, which will stop them developing, or where and how to use puberty blockers. Eventually, they are affirming double mastectomies or castration. By the way, this is of course all over social media, because once you start to search on it, TikTok is rife with it. Algorithmically—we are reminded all the time to think about systems—once you have had a look at it, it is everywhere. My point is that this is affirmed socially.
Imagine a situation whereby, in society offline, some young woman who has an eating disorder and weighs 4 stone comes to you and says “Look, I’m so fat”. If you said, “Yes, of course you’re fat—I’ll help you slim”, we would think it was terrible. When that happens online, crudely and sometimes cruelly, we want to tackle it here. If some young woman came to you and said, “I want to self-harm; I feel so miserable that I want to cut myself”, and you started recommending blades, we would think it was atrocious behaviour. In some ways, that is what is happening online and that is where I have every sympathy with these amendments. Yet when it comes to gender dysphoria, which actually means encouraging self-harm, because it is a cultural phenomenon that is popular it does not count.
In some ways, I could be arguing that we should future-proof this legislation by including those self-harms in the definition put forward by the amendments in this group. However, I raise it more to indicate that, as with all definitions, it is not quite as easy as one would think. I appreciate that a number of noble Lords, and others engaged in this discussion, might think that I am merely exhibiting prejudice rather than any genuine compassion or concern for those young people. I would note that if noble Lords want to see a rising group of people who are suicidal and feel that their life is really not worth living, search out the work being done on detransitioners who realise too late that that affirmation by adults has been a disaster for them.
On the amendments suggesting another advisory committee with experts to advise Ofcom on how we regulate such harms, I ask that we are at least cautious about which experts. To mention one of the expert bodies, Mermaids has become controversial and has actually been advocating some of those self-harms, in my opinion. It is now subject to a Charity Commission investigation but has been on bodies such as this advising about young people. I would not think that appropriate, so I just ask that some consideration is given to which experts would be on such bodies.
My Lords, I also support the amendments in the name of my noble friend Lady Finlay. I want to address a couple of issues raised by the noble Lord, Lord Allan. He made a fantastic case for adequate redress systems, both at platform level and at independent complaint level, to really make sure that, at the edge of all the decisions we make, there is sufficient discussion about where that edge lies.
The real issue is not so much the individuals who are in recovery and seeking to show leadership but those who are sent down the vortex of self-harm and suicide material that comes in its scores—in its hundreds and thousands—and completely overwhelms them. We must not make a mistake on the edge case and not deal with the issue at hand.
There is absolutely not enough signposting. I have seen at first hand—I will not go through it again; I have told the Committee already—companies justifying material that it was inconceivable to justify as being a cry for help. A child with cuts and blood running down their body is not a cry for help; that is self-harm material.
From experience, I think it is true that companies get defensive and seek to defend the indefensible on occasion. I agree with the noble Baroness on that, but I will balance it a little as I also work with people who were agonising over not wanting to make a bad situation worse. They were genuinely struggling and seeking to do the right thing. That is where the experts come in. If someone would say to them, “Look, take this stuff down; that is always better”, it would make their lives easier. If they said, “Please leave it up”, they could follow that advice. Again, that would make their lives easier. On the excuses, I agree that sometimes they are defending the indefensible, but also there are people agonising over the right thing to do and we should help them.
I absolutely agree. Of course, good law is a good system, not a good person.
I turn to the comments that I was going to make. Uncharacteristically, I am a little confused about this issue and I would love the Minister’s help. My understanding on reading the Bill very closely is that self-harm and suicide content that meets a legal definition will be subject to the priority illegal content duties. In the case of children, we can safely anticipate that content of this kind will be named primary priority content. Additionally, if such content is against the terms of service of a regulated company, the company can be held to those terms. Category 1 services will have to provide a user empowerment tool so that such content can be toggled out if an adult user wishes. That is my understanding of where this content has already been dealt with in the Bill. To my mind, this leaves the following ways in which suicide and self-harm material, which is the subject of this group of amendments, is not covered by the Bill. That is what I would like the Minister to confirm, and I absolutely stand ready to be corrected.
In the case of adults, if self-harm and suicide material does not meet the bar of illegal content and the service is not category 1, there is no mechanism to toggle it out. Ofcom has no power to require a service to provide tools to toggle self-harm and suicide material out by default. This means that self-harm and suicide material can be as prevalent as platforms like—pushed, promoted and recommended, as I have just explained—if it is not contrary to the terms of service, so long as it does not reach the bar of illegal content.
Search services are not subject to these clauses—I am unsure about that. In the case of both children and adults, if self-harm and suicide material is on blogs or services with limited functionality, it is out of scope of the Bill and there is absolutely nothing Ofcom can do. For non-category 1 services—the majority of services which claim that an insignificant number of children access their site and thus that they do not have to comply with the child safety duties—there are no protections for a child against this content.
I put it like that because I believe that each of the statements I just made could have been fixed by amendments already discussed during the past six days in Committee. We are currently planning to leave many children without the protection of the safety duties, to leave vulnerable adults without even the cover of default protections against material that has absolutely no public interest and to leave companies to decide whether to promote or use this material to fuel user engagement—even if it costs well-being and lives.
I ask the Minister to let me know if I have misunderstood, but I think it is really quite useful to see what is left once the protections are in place, rather than always concentrating on the protections themselves.
My Lords, I support the noble Baroness, Lady Finlay of Llandaff, in her Amendment 96 and others in this group. The internet is fuelling an epidemic of self-harm, often leading to suicide among young people. Thanks to the noble Baroness, Lady Kidron, I have listened to many grieving families explaining the impact that social media had on their beloved children. Content that includes providing detailed instructions for methods of suicide or challenges or pacts that seek agreement to undertake mutual acts of suicide or deliberate self-injury must be curtailed, or platforms must be made to warn and protect vulnerable adults.
I recognise that the Government acknowledge the problem and have attempted to tackle it in the Bill with the new offence of encouraging or assisting serious self-harm and suicide and by listing it as priority illegal content. But I agree with charities such as Samaritans, which says that the Government are taking a partial approach by not accepting this group of amendments. Samaritans considers that the types of suicide and self-harm content that are legal but unequivocally harmful include information, depictions, instructions and advice on methods of self-harm or suicide; content that portrays self-harm and suicide as positive or desirable; and graphic descriptions or depictions of self-harm and suicide.
With the removal of regulation of legal but harmful content, much suicide and self-harm content can remain easily available, and platforms will not even need to consider the risk that such content could pose to adult users. These amendments aim to ensure that harmful self-harm and suicide content is addressed across all platforms and search services, regardless of their functionality or reach, and, importantly, for all persons regardless of age.
In 2017 an inquiry into suicides of young people found suicide-related internet use in 26% of deaths in under-20s and 13% of deaths in 20 to 24 year-olds. Three-quarters of people who took part in Samaritans’ research with Swansea University said that they had harmed themselves more severely after viewing self-harm content online, as the noble Baroness, Lady Finlay, pointed out. People of all ages can be susceptible to harm from this dangerous content. There is shocking evidence that between 2011 and 2015, 151 patients who died by suicide were known to have visited websites that encouraged suicide or shared information about methods of harm, and 82% of those patients were over 25.
Suicide is complex and rarely caused by one thing. However, there is strong evidence of associations between financial difficulties, mental health and suicide. People on the lowest incomes have a higher suicide risk than those who are wealthier, and people on lower incomes are also the most affected by rising prices and other types of financial hardship. In January and February this year the Samaritans saw the highest percentage of first-time phone callers concerned about finance or unemployment—almost one in 10 calls for help in February. With the cost of living crisis and growing pressure on adults to cope with stress, it is imperative that the Government urgently bring in these amendments to help protect all ages from harmful suicide and self-harm content by putting a duty on providers of user-to-user services to properly manage such content.
A more comprehensive online safety regime for all ages will also increase protections for children, as research has shown that age verification and restrictions across social media and online platforms are easily bypassed by them. As the Bill currently stands, there is a two-tier approach to safety which can still mean that children may circumnavigate safety controls and find this harmful suicide and self-harm content.
Finally, user empowerment duties that we debated earlier are no substitute for regulation of access to dangerous suicide and self-harm online content through the law that these amendments seek to achieve.
My Lords, I thank the noble Baroness, Lady Finlay, for introducing the amendments in the way she did. I think that what she has done, and what this whole debate has done, is to ask the question that the noble Baroness, Lady Kidron, posed: we do not know yet quite where the gaps are until we see what the Government have in mind in terms of the promised new offence. But it seems pretty clear that something along the lines of what has been proposed in this debate needs to be set out as well.
One of the most moving aspects of being part of the original Joint Committee on the draft Bill was the experience of listening to Ian Russell and the understanding, which I had not come across previously, of the sheer scale of the kind of material that has been the subject of this debate on suicide and self-harm encouragement. We need to find an effective way of dealing with it and I entirely take my noble friend’s point that this needs a combination of protectiveness and support. I think the combination of these amendments is designed to do precisely that and to learn from experience through having the advisory committee as well.
It is clear that, by itself, user empowerment is just not going to be enough in all of this. I think that is the bottom line for all of us. We need to go much further, and we owe a debt to the noble Baroness, Lady Finlay, for raising these issues and to the Samaritans for campaigning on this subject. I am just sorry that my noble friend Lady Tyler cannot be here because she is a signatory to a number of the amendments and feels very strongly about these issues as well.
I do not think I need to unpack a great deal of the points that have been made. We know that suicide is a leading cause of death in males under 50 and females under 35 in the UK. We know that so many of the deaths are internet-related and we need to find effective methods of dealing with this. These are meant to be practical steps.
I take the point of the noble Baroness, Lady Fox, not only that it is a social problem of some magnitude but that the question of definitions is important. I thought she strayed well beyond where the definition of “self-harm” actually lies, but one could discuss that. I thought the point of the noble Baroness, Lady Kidron—that we want good law, not reliance on good people—was about definitions. We cannot just leave it to the discretion of an individual, however good they may be, moderating on a social media platform.
Along with the Samaritans, I very much regret that we no longer have the legal but harmful category, which would help guide us in this area. I share its view that we need to protect people of all ages from all extremely dangerous suicide and self-harm content on large and small platforms. I think this is a way of doing it. The type of content can be perhaps more tightly defined in terms of the kinds of information, depictions, instructions and the kinds of content, the portrayal and the graphic descriptions that occur. One perhaps might be able to do more in that direction. I very much hope that we can move further today with some assurance from the Minister in this area.
The establishment of a specific unit within Ofcom, which was the subject of the Private Member’s Bill of the noble Baroness, Lady Finlay, is potentially a very useful addition to the Bill. I very much hope that the Minister takes that on board as well.
This has been a very good debate indeed. I have good days and bad days in Committee. Good days are when I feel that the Bill is going to make a difference and things are going to improve and the sun will shine. Bad days are a bit like today, where we have had a couple of groups, and this is one of them, where I am a bit worried about where we are and whether we have enough—I was going to use that terrible word “ammunition” but I do not mean that—of the powers that are necessary in the right place and with the right focus to get us through some of the very difficult questions that come in. I know that bad cases make bad law, but they can also illustrate why the law is not good enough. As the noble Baroness, Lady Kidron, was saying, this is possibly one of the areas we are in.
The speeches in the debate have made the case well and I do not need to go back over it. We have got ourselves into a situation where we want to reduce the harm that we see around us but do not want to impact freedom of expression. Both of those are so important and we have to hold on to them, but we find ourselves struggling. What do we do about that? We have to think through what we will end up with once this Bill is on the statute book and the codes of practice are made under it. This looks as though it is heading towards the question of whether the terms of service that will be in place will be sufficient and able to restrict the harms we will see affecting people who should not be affected by them. But I recognise that the freedom of expression arguments have won the day and we have to live with that.
The noble Baroness, Lady Kidron, mentioned the riskiness of the smaller sites—categories 2A and 2B and the ones that are not even going to be categorised as high as that. Why are we leaving those to cause the damage that they are? There is something not working here in the structure of the Bill and I hope the Minister will be able to provide some information on that when he comes to speak.
Obviously, if we could find a way of expressing the issues that are raised by the measures in these amendments as being illegal in the real world, they would be illegal online as well. That would at least be a solution that we could rely on. Whether it could be policed and serviced is another matter, but it certainly would be there. But we are probably not going to get there, are we? I am not looking at the Minister in any hope but he has a slight downward turn to his lips. I am not sure about this.
How can we approach a legal but harmful issue with the sort of sensitivity that does not make us feel that we have reduced people’s ability to cope with these issues and to engage with them in an adult way? I do not have an answer to that.
Is this another amplification issue or is it deeper and worse than that? Is this just the internet because of its ability to focus on things to keep people engaged, to make people stay online when they should not, to make them reach out and receive material that they ought not to get in a properly regulated world? Is it something that we can deal with because we have a sense of what is moral and appropriate and want to act because society wants us to do it? I do not have a solution to that, and I am interested to hear what the Minister will say, but I think it is something we will need to come back to.
My Lords, like everyone who spoke, I and the Government recognise the tragic consequences of suicide and self-harm, and how so many lives and families have been devastated by it. I am grateful to the noble Baroness and all noble Lords, as well as the bereaved families who campaigned so bravely and for so long to spare others that heartache and to create a safer online environment for everyone. I am grateful to the noble Baroness, Lady Finlay of Llandaff, who raised these issues in her Private Member’s Bill, on which we had exchanges. My noble friend Lady Morgan is right to raise the case of Frankie Thomas and her parents, and to call that to mind as we debate these issues.
Amendments 96 and 296, tabled by the noble Baroness, Lady Finlay, would, in effect, reintroduce the former adult safety duties whereby category 1 companies were required to assess the risk of harm associated with legal content accessed by adults, and to set and enforce terms of service in relation to it. As noble Lords will know, those duties were removed in another place after extensive consideration. Those provisions risked creating incentives for the excessive removal of legal content, which would unduly interfere with adults’ free expression.
However, the new transparency, accountability and freedom of expression duties in Part 4, combined with the illegal and child safety duties in Part 3, will provide a robust approach that will hold companies to account for the way they deal with this content. Under the Part 4 duties, category 1 services will need to have appropriate systems and processes in place to deal with content or activity that is banned or restricted by their terms of service.
Many platforms—such as Twitter, Facebook and TikTok, which the noble Baroness raised—say in their terms of service that they restrict suicide and self-harm content, but they do not always enforce these policies effectively. The Bill will require category 1 companies—the largest platforms—fully to enforce their terms of service for this content, which will be a significant improvement for users’ safety. Where companies allow this content, the user-empowerment duties will give adults tools to limit their exposure to it, if they wish to do so.
The noble Baroness is right to raise the issue of algorithms. As the noble Lord, Lord Stevenson, said, amplification lies at the heart of many cases. The Bill will require providers specifically to consider as part of their risk assessments how algorithms could affect children’s and adults’ exposure to illegal content, and content that is harmful to children, on their services. Providers will need to take steps to mitigate and effectively manage any risks, and to consider the design of functionalities, algorithms and other features to meet the illegal content and child safety duties in the Bill.
Following our earlier discussion, we were going to have a response on super-complaints. I am curious to understand this: if there were a pattern of complaints—such as those the noble Baroness, Lady Kidron, and others received—about a platform saying, under its terms of service, that it would remove suicide and self-harm content but failing to do so, does the Minister think that is precisely the kind of thing that an organisation could bring as a super-complaint to Ofcom?
My initial response is, yes, I think so, but it is the role of Ofcom to look at whether those terms of service are enforced and to act on behalf of internet users. The noble Lord is right to point to the complexity of some marginal cases with which companies have to deal, but the whole framework of the Bill is to make sure that terms of service are being enforced. If they are not, people can turn to Ofcom.
The Minister is going through the structure of the Bill and saying that what is in it is adequate to prevent the kinds of harms to vulnerable adults that we talked about during this debate. Essentially, it is a combination of adherence to terms of service and user-empowerment tools. Is he saying that those two aspects are adequate to prevent the kinds of harms we have talked about?
Yes, they are—with the addition of what I am coming to. In addition to the duty for companies to consider the role of algorithms, which I talked about, Ofcom will have a range of powers at its disposal to help it assess whether providers are fulfilling their duties, including the power to require information from providers about the operation of their algorithms. The regulator will be able to hold senior executives criminally liable if they fail to ensure that their company is providing Ofcom with the information it requests.
However, we must not restrict users’ right to see legal content and speech. These amendments would prescribe specific approaches for companies’ treatment of legal content accessed by adults, which would give the Government undue influence in choosing, on adult users’ behalf, what content they see—
I wanted to give the Minister time to get on to this. Can we now drill down a little on the terms of service issue? If the noble Baroness, Lady Kidron, is right, are we talking about terms of service having the sort of power the Government suggest in cases where they are category 1 and category 2A but not search? There will be a limit, but an awful lot of other bodies about which we are concerned will not fall into that situation.
Also, I thought we had established, much to our regret, that the terms of service were what they were, and that Ofcom’s powers—I paraphrase to make the point—were those of exposure and transparency, not setting minimum standards. But even if we are talking only about the very large and far-reaching companies, should there not be a power somewhere to engage with that, with a view to getting that redress, if the terms of service do not specify it?
The Bill will ensure that companies adhere to their terms of service. If they choose to allow content that is legal but harmful on their services and they tell people that beforehand—and adults are able and empowered to decide what they see online, with the protections of the triple shield—we think that that strikes the right balance. This is at the heart of the whole “legal but harmful” debate in another place, and it is clearly reflected throughout the approach in the Bill and in my responses to all of these groups of amendments. But there are duties to tackle illegal content and to make sure that people know the terms of service for the sites they choose to interact with. If they feel that they are not being adhered to—as they currently are not in relation to suicide and self-harm content on many of the services—users will have the recourse of the regulator to turn to.
I think that noble Lords are racing ahead a little bit in being pessimistic about the work of Ofcom, which will be proactive in its supervisory role. That is a big difference from the status quo, in terms of the protection for users. We want to strike the right balance to make sure that we are enforcing terms of service while protecting against the arbitrary removal of legal content, and the Bill provides companies with discretion about how to treat that sort of content, as accessed by their users. However, we agree that, by its nature, this type of content can be very damaging, particularly for vulnerable young people, which is why the Government remain committed to introducing a new criminal offence of content that encourages or promotes serious self-harm. The new offence will apply to all victims, children as well as adults, and will be debated once it is tabled; we will explore these details a bit more then. The new law will sit alongside companies’ requirements to tackle illegal suicide content, including material that encourages or assists suicide under the terms of the Suicide Act 1961.
The noble Baronesses, Lady Finlay and Lady Kidron, asked about smaller websites and fora. We are concerned about the widespread availability of content online which promotes or advertises methods of suicide and self-harm, and which can easily be accessed by people who are young or vulnerable. Where suicide and self-harm websites host user-generated content, they will be in scope of the Bill. Those sites will need proactively to prevent users being exposed to priority illegal content, including content that encourages or assists suicide, as set out in the 1961 Act.
The noble Baroness asked about the metaverse, which is in scope of the Bill as a user-to-user service. The approach of the Bill is to try to remain technology neutral.
I will plant a flag in reference to the new offences, which I know we will come back to again. It is always helpful to look at real-world examples. There is a lot of meme-based self-harm content. Two examples are the Tide Pods challenge—the eating of detergent capsules—and choking games, both of which have been very common and widespread. It would be helpful, ahead of our debate on the new offences, to understand whether they are below or above the threshold of serious self-harm and what the Government’s intention is. There are arguments both ways: obviously, criminalising children for being foolish carries certain consequences, but we also want to stop the spread of the content. So, when we come to that offence, it would be helpful if the Minister could use specific examples, such as the meme-based self-harm content, which is quite common.
I thank the noble Lord for the advance notice to think about that; it is helpful. It is difficult to talk in general terms about this issue, so, if I can, I will give examples that do, and do not, meet the threshold.
The Bill goes even further for children than it does for adults. In addition to the protections from illegal material, the Government have indicated, as I said, that we plan to designate content promoting suicide, self-harm or eating disorders as categories of primary priority content. That means that providers will need to put in place systems designed to prevent children of any age encountering this type of content. Providers will also need specifically to assess the risk of children encountering it. Platforms will no longer be able to recommend such material to children through harmful algorithms. If they do, Ofcom will hold them accountable and will take enforcement action if they break their promises.
It is right that the Bill takes a different approach for children than for adults, but it does not mean that the Bill does not recognise that young adults are at risk or that it does not have protections for them. My noble friend Lady Morgan was right to raise the issue of young adults once they turn 18. The triple shield of protection in the Bill will significantly improve the status quo by protecting adults, including young adults, from illegal suicide content and legal suicide or self-harm content that is prohibited in major platforms’ terms and conditions. Platforms also have strong commercial incentives, as we discussed in previous groups, to address harmful content that the majority of their users do not want to see, such as legal suicide, eating disorder or self-harm content. That is why they currently claim to prohibit it in their terms and conditions, and why we want to make sure that those terms and conditions are transparently and accountably enforced. So, while I sympathise with the intention from the noble Baroness, Lady Finlay, her amendments raise some wider concerns about mandating how providers should deal with legal material, which would interfere with the careful balance the Bill seeks to strike in ensuring that users are safer online without compromising their right to free expression.
The noble Baroness’s Amendment 240, alongside Amendment 225 in the name of the noble Lord, Lord Stevenson, would place new duties on Ofcom in relation to suicide and self-harm content. The Bill already has provisions to provide Ofcom with broad and effective information-gathering powers to understand how this content affects users and how providers are dealing with it. For example, under Clause 147, Ofcom can already publish reports about suicide and self-harm content, and Clauses 68 and 69 empower Ofcom to require the largest providers to publish annual transparency reports.
Ofcom may require those reports to include information on the systems and processes that providers use to deal with illegal suicide or self-harm content, with content that is harmful to children, or with content which providers’ own terms of service prohibit. Those measures sit alongside Ofcom’s extensive information-gathering powers. It will have the ability to access the information it needs to understand how companies are fulfilling their duties, particularly in taking action against this type of content. Furthermore, the Bill is designed to provide Ofcom with the flexibility it needs to respond to harms—including in the areas of suicide, self-harm and eating disorders—as they develop over time, in the way that the noble Baroness envisaged in her remarks about the metaverse and new emerging threats. So we are confident that these provisions will enable Ofcom to assess this type of content and ensure that platforms deal with it appropriately. I hope that this has provided sufficient reassurance to the noble Baroness for her not to move her amendment.
My Lords, I am extremely grateful to everyone who has contributed to this debate. It has been a very rich debate, full of information; my notes have become extensive during it.
There are a few things that I would like to know more about: for example, how self-harm, which has been mentioned by the Minister, is being defined, given the debate we have had about how to define self-harm. I thought of self-harm as something that does lasting and potentially life-threatening damage. There are an awful lot of things that people do to themselves that others might not like them doing but that do not fall into that category. However, the point about suicide and serious self-harm is that when you are dead, that is irreversible. You cannot talk about healing, because the person has now disposed of their life, one way or another.
I am really grateful to the noble Baroness, Lady Healy, for highlighting how complex suicide is. Of course, one of the dangers with all that is on the internet is that the impulsive person gets caught up rapidly, so what would have been a short thought becomes an overwhelming action leading to their death.
Having listened to the previous debate, I certainly do not understand how Ofcom can have the flexibility to really know what is happening and how the terms of service are being implemented without a complaints system. I echo the really important phrase from the noble Lord, Lord Stevenson of Balmacara: if it is illegal in the real world, why are we leaving it on the internet?
Many times during our debates, the noble Baroness, Lady Kidron, has pushed safety by design. In many other things, we have defaults. My amendments were not trying to provide censorship but simply trying to provide a default, a safety stop, to stop things escalating, because we know that they are escalating at the moment. The noble Lord, Lord Stevenson of Balmacara, asked whether it was an amplification or a reach issue. I add, “or is it both?”. From all the evidence we have before us, it appears to be.
I am very grateful to the noble Lord, Lord Clement-Jones, for pressing that we must learn from experience and that user empowerment to switch off simply does not go far enough: people who are searching for this and already have suicidal ideation will not switch it off because they have started searching. There is no way that could be viewed as a safety feature in the Bill, and it concerns me.
Although I will withdraw my amendment today, of course, I really feel that we will have to return to this on Report. I would very much appreciate the wisdom of other noble Lords who know far more about working on the internet and all the other aspects than I do. I am begging for assistance in trying to get the amendments right. If not, the catalogue of deaths will mount up. This is literally a once-in-a-lifetime opportunity. For the moment, I beg leave to withdraw.
Amendment 96 withdrawn.
Clause 36: Codes of practice about duties
Amendment 96A not moved.
97: Clause 36, page 36, line 42, at end insert “including a code of practice describing measures for the purpose of compliance with the relevant duties so far as relating to violence against women and girls.”
Member’s explanatory statement
This amendment would impose an express obligation on OFCOM to issue a code of practice on violence against women and girls rather than leaving it to OFCOM’s discretion. This would ensure that Part 3 providers recognise the many manifestations of online violence, including illegal content, that disproportionately affect women and girls.
My Lords, it is a great pleasure to move Amendment 97 and speak to Amendment 304, both standing in my name and supported by the noble Baroness, Lady Kidron, the right reverend Prelate the Bishop of Gloucester and the noble Lord, Lord Knight of Weymouth. I am very grateful for their support. I look forward to hearing the arguments by the noble Lord, Lord Stevenson, for Amendment 104 as well, which run in a similar vein.
These amendments are also supported by the Domestic Abuse Commissioner, the Revenge Porn Helpline, BT, EE and more than 100,000 UK citizens who have signed End Violence Against Women’s petition urging the Government to better protect women and girls in the Bill.
I am also very grateful to the noble Baroness, Lady Foster of Aghadrumsee—I know I pronounced that incorrectly—the very distinguished former Northern Ireland politician. She cannot be here to speak today in favour of the amendment but asked me to put on record her support for it.
I also offer my gratitude to the End Violence Against Women Coalition, Glitch, Refuge, Carnegie UK, NSPCC, 5Rights, Professor Clare McGlynn and Professor Lorna Woods. Between them all, they created the draft violence against women and girls code of practice many months ago, proving that a VAWG code of practice is not only necessary but absolutely deliverable.
Much has already been said on this, both here and outside the Chamber. In the time available, I will focus my case for these amendments on two very specific points. The first is why VAWG, violence against women and girls, should have a specific code of practice legislated for it, rather than other content we might debate. The second is what having a code of practice means in relation to the management of that content.
Ofcom has already published masses of research showing that abuse online is gendered. The Government’s own fact sheet, sent to us before these debates, said that women and girls experience disproportionate levels of abuse online. They experience a vast array of abuse online because of their gender, including cyberflashing, harassment, rape threats and stalking. As we have already heard and will continue to hear in these debates, some of those offences and abuse reach a criminal threshold and some do not. That is at the heart of this debate.
The first death threat that I received—I have received a number, sadly, both to me and to my family—did not talk about death or dying. It said that I was going to be “Jo Coxed”. Of course, I reported that to Twitter and the AI content moderator. Because it did not have those words in it, it was not deemed to be a threat. It was not until I could speak to a human being—in this case, the UK public affairs manager of Twitter, to whom I am very grateful—that it even started to be taken seriously.
The fear of being harassed is impacting women’s freedom of speech. The Fawcett Society has found that 73% of female MPs, versus 51% of male MPs, say that they avoid speaking online in certain discussions because of fear of the consequences of doing so. Other women in the public eye, such as the footballer Karen Carney, have also been driven offline due to gendered abuse.
Here is the thing I cannot reconcile with the government response on this so far. This Government have absolutely rightly recognised that violence against women and girls is a national threat. They have made it a part of the strategic policing requirement. If tackling online abuse against women and girls is a priority, as the Government say, and if, as in the stated manifesto commitment of 2019, they want the UK to be
“the safest place in the world to be online”,
why are the words “women and girls” not used once in the 262 pages of the current draft of the Bill?
The Minister has said that changes have been made in the other House on the Bill—I understand that—and that it is now focused more on the protection of children in relation to certain content, whereas adults are deemed to be able to choose more what they see and how they see it. But there is a G in VAWG, for girls. The code of practice that we are talking about would benefit that very group of people—young girls, who are children—whom the Government have said that they really want to protect through the Bill.
Online harassment does not affect only women in the public eye but all women. I suspect that we all now know the statistic that women are 27 times more likely to be harassed online than men. In other words, to have an online presence as a woman is to expect harassment. That is not to say that men do not face abuse online, but a lot of the online abuse is deliberately gendered and is targeted at women. Do men receive rape threats on the same vast scale as women and young girls?
It should not be the public’s job to force a platform to act on the harmful content that it is hosting, just as it should not be a woman’s job to limit her online presence to prevent harassment. But the sad reality is that, in its current form, the Bill is doing very little to force platforms to act holistically in relation to violence against women and girls and to change their culture online.
The new VAWG-relevant criminal offences listed in the Bill—I know that my noble friend the Minister will rely on these in his response to the debate—including cyberflashing and coercive and controlling behaviour, are an important step, but even these new offences have their own issues, which I suspect we will come on to debate in the next day of Committee: for example, cyberflashing being motive-based instead of consent-based. Requiring only those platforms caught by the Bill to look at the criminal offences individually ignores the rest of the spectrum of gendered abuse.
Likewise, the gender-neutral approach in the Bill will harm children. NSPCC research found that in 2021-22, four in five victims of online grooming offences were girls. The Internet Watch Foundation, an organisation we are going to talk about in the next group, has found in recently published statistics that girls are more likely to be seriously abused online. I have already stated that this is not to say that boys and men do not experience abuse online, but the fact is that women and girls are several times more likely to be abused. This is not an argument against free speech; people online should be allowed to debate and disagree with each other, but discussions can and should be had without the threat of rape or harassment.
Again, the Government will argue that the triple-shield approach to combating legal but harmful content online will sufficiently protect women and girls, but this is not the case. Instead of removing this content, the Bill’s user empowerment tools—much debated already—expect women to shield themselves from seeing it. All this does is allow misogynistic and often violent conversations to continue without women knowing about them, the result of which can be extremely dangerous. A victim of domestic abuse could indeed block the user threatening to kill them, but that does not stop that user from continuing to post the threats he is making, or even posting photos of the victim’s front door. Instead of protecting the victim, these tools potentially leave them even more vulnerable to real-life harms. Likewise, the triple shield will rely too heavily on platforms setting their own terms and conditions. We have just heard my noble friend the Minister using this argument in the last group, but the issue is that the platforms can choose to water down their terms and conditions, and Ofcom is then left without recourse.
I turn to what a violence against women and girls code of practice would mean. It could greatly reduce all the dangers I have just laid out. It would ensure that services regularly review their algorithms to stop misogyny going viral, and that moderators are taught, for example, how to recognise different forms of online violence against women and girls, including forms of tech abuse. Ofcom has described codes of practice as
“key documents, which set out the steps services can take to comply with their duties”.
Services can choose to take an alternative approach to complying with their duties, provided that it is consistent with the duties in the Bill, but codes will provide a clear route to compliance, and Ofcom envisages that many services will therefore take advantage of them.
The value of having a code lies in its systemic approach. It does not focus on individual items of content—which is one of the worries that have been expressed, both in this House and outside—but it focuses the platforms’ minds on the whole environment in which the tech-enabled abuse occurs. The code of practice would make the UK the first country in the world to hold tech companies specifically to account on tackling violence against women and girls. It would also make the Online Safety Bill more future-proof, because it would provide a proactive and agile route for identifying and problem-solving new forms of online VAWG as they emerge, rather than delaying action until the creation of a new criminal offence when the next relevant piece of primary legislation comes along.
I finish by saying that throughout the Bill’s journey through Parliament, we have debated whether it sufficiently protects women and girls. The objective answer is, “No, it does not”, but there appears to be a real reluctance to accept this as fact. Instead of just agreeing to disagree on this topic, we instead have an opportunity here to protect millions of women and girls online with a violence against women and girls code of practice. So I ask noble Lords to support this critical amendment, not just for the sake of themselves, their daughters, their sisters or their wives but for the sake of the millions of women whose names we will never know but who will be grateful that we stood on their side on the issue of gendered online violence. I beg to move.
My Lords, I have added my name to Amendments 97 and 304, and I wholeheartedly agree with all that the noble Baroness, Lady Morgan, said by means of her excellent introduction. I look forward to hearing what the noble Baroness, Lady Kidron, has to say as she continues to bring her wisdom to the Bill.
Let me say from the outset, if it has not been said strongly enough already, that violence against women and girls is an abomination. If we allow a culture of intimidation and misogyny to exist online, it will spill over to offline experiences. According to research by Refuge, almost one in five domestic abuse survivors who experienced abuse or harassment from their partner or former partner via social media said they felt afraid of being attacked or being subjected to physical violence as a result. Some 15% felt that their physical safety was more at risk, and 5% felt more at risk of so-called honour-based violence. Shockingly, according to Amnesty International, 41% of women who experienced online abuse or harassment said that these experiences made them feel that their physical safety was threatened.
Throughout all our debates, I hesitate to differentiate between the real and virtual worlds, because that is simply not how we live our lives. Interactions online are informed by face-to-face interactions, and vice versa. To think otherwise is to misunderstand the lived experience of the majority—particularly, dare I say, the younger generations. As Anglican Bishop for HM Prisons, I recognise the complexity of people’s lives and the need to tackle attitudes underpinning behaviours. Tackling the root causes of offending should always be a priority; there is potential for much harm later down the line if we ignore warning signs of hatred and misogyny. Research conducted by Refuge found that one in three women has experienced online abuse or harassment perpetrated on social media or another online platform at some point in their lives. That figure rises to almost two in three, or 62%, among young women. This must change.
We did some important work in your Lordships’ House during the passage of the Domestic Abuse Act to ensure that all people, including women and girls, are safe on our streets and in their homes. As has been said, introducing a code of practice as outlined will help the Government meet their aim of making the UK the safest place in the world to be online, and it will align with the Government’s wider priority to tackle violence against women and girls as a strategic policing requirement. Other strategic policing requirements, including terrorism and child sexual exploitation, have online codes of practice, so surely it follows that there should be one for VAWG to ensure that the Bill aligns with the Government’s position elsewhere and that there is not a gap left online.
I know the Government care deeply about tackling violence against women and girls, and I believe they have listened to some concerns raised by the sector. The inclusion of the domestic abuse and victims’ commissioners as statutory consultees is welcomed, as is the Government’s amendment to recognise controlling and coercive behaviour as a priority offence. However, without this code of conduct, the Bill will fail to address duties of care in relation to preventing domestic abuse and violence against women and girls in a holistic and encompassing way. The onus should not be on women and girls to remove themselves from online spaces; we have seen plenty of that in physical spaces over the years. Women and girls must be free to appropriately express themselves online and offline without fear of harassment. We must do all we can to prevent expressions of misogyny from transforming into violent actions.
My Lords, I have added my name to Amendments 97 and 304, and I support the others in this group. It seems to be a singular failure of any version of an Online Safety Bill if it does not set itself the task of tackling known harms—harms that are experienced daily and for which we have a phenomenal amount of evidence. I will not repeat the statistics given in the excellent speeches made by the noble Baroness, Lady Morgan, and the right reverend Prelate, but will instead add two observations.
I am of the generation that saw women break gender barriers and glass ceilings. We were ourselves the beneficiaries of the previous generation, which both intellectually and practically pushed the cause of gender equality. Many of us are also employers, mothers, aunties or friends of a generation in which the majority of the young favour a more gender-equal world. Yet we have seen online the amplification of those who hold a profound resentment of what I would characterise as hard-won and much-needed progress. The vileness and violence of their fury is fuelled by the rapacious appetite of algorithms that profit from engagement, which has allowed gender-based detractors, haters and abusers to normalise misogyny to such a degree that rape threats and threats of violence are trotted out against women for the mildest of perceived infractions. In the case of an academic colleague of mine, her crime—for which she received rape threats—was offering a course in women’s studies.
If the price of having a voice online continues to be that you have to withstand a supercharged swarm of abuse then for many women it is simply not worth it. As the noble Baroness, Lady Morgan, said, they are effectively silenced. This sadly extends to girls, who repeatedly say that, as the statistics persistently show, they are put off any kind of public role and even expressing a view because they fear both judgment on how they look and abuse for what they say. How heartbreaking it is that the organising technology of our time is so regressive for women and girls that the gains we have made in our lifetime are being denied them. This is why I believe that Parliament must be clear that an environment in which women and girls are routinely silenced or singled out for abuse is not okay.
My second observation is slightly counterintuitive, because I so wish for these amendments to find their way into the Bill. I have a sense of disquiet that there will be no similar consideration of other exposed or vulnerable groups that are less well-represented in Parliament. I therefore want to take this opportunity to say once again that we have discussed in our debates on previous groups amendments that would commit the Bill to the Equality Act 2010, with the expectation that companies will adhere to UK law across all groups with protected characteristics, including those who may have more than one protected characteristic, and take note that—this point has been made in a number of briefings—women with disabilities and mixed or global-majority heritage come in for double, sometimes triple, doses of abuse. In saying that, I wish to acknowledge the amendments in the name of the noble Baroness, Lady Fox, which make it clear that the discussion of protected characteristics does not in and of itself constitute harm. I very much agree with her on that.
Perhaps this is a good moment to remember that the Bill is proposed as a systems and processes regime—no single piece of content will be at stake but rather, if a company is amplifying and promoting at scale behaviours that hound women and girls out of the public space, Ofcom will have the tools to deal with it. At the risk of repeating myself, these are not open spaces; they are 100% engineered and largely privately owned. I fail to see another environment in which it is either normal or lawful to swarm women with abuse and threat.
On our first day in Committee, the Minister said in his response to the amendment in the name of the noble Lord, Lord Stevenson, that the Government are very clear on the Bill’s purposes. Among the list of purposes that he gave was
“to protect people who face disproportionate harm online including, for instance, because of their sex or their ethnicity or because they are disabled”.—[Official Report, 19/4/23; col. 724.]
I ask the Minister to make this Bill come true on that purpose.
My Lords, I strongly support Amendment 97 in the name of the noble Baroness, Lady Morgan. We must strengthen the Bill by imposing an obligation on Ofcom to develop and issue a code of practice on violence against women and girls. This will empower Ofcom and guide services in meeting their duties in regard to women and girls, and encourage them to recognise the many manifestations of online violence that disproportionately affect women and girls.
Refuge, the domestic abuse charity, has seen a growing number of cases of technology-facilitated domestic abuse in recent years. As other noble Lords have said, this tech abuse can take many forms but social media is a particularly powerful weapon for perpetrators, with one in three women experiencing online abuse, rising to almost two in three among young women. Yet the tech companies have been too slow to respond. Many survivors are left waiting weeks or months for a response when they report abusive content, if indeed they receive one at all. It appears that too many services do not understand the risks and nature of VAWG. They do not take complaints seriously and they think that this abuse does not breach community standards. A new code would address this with recommended measures and best practice on the appropriate prevention of and response to violence against women and girls. It would also support the delivery of existing duties set out in the Bill, such as those on illegal content, user empowerment and child safety.
I hope the Minister can accept this amendment, as it would be in keeping with other government policies, such as in the strategic policing requirement, which requires police forces to treat violence against women and girls as a national threat. Adding this code would help to meet the Government’s national and international commitments to tackling online VAWG, such as the tackling VAWG strategy and the Global Partnership for Action on Gender-Based Online Harassment and Abuse.
The Online Safety Bill is a chance to act on tackling the completely unacceptable levels of abuse of women and girls by making it clear through Ofcom that companies need to take this matter seriously and make systemic changes to the design and operation of their services to address VAWG. It would allow Ofcom to add this as a priority, as mandated in the Bill, rather than leave it as an optional extra to be tackled at a later date. The work to produce this code has already been done thanks to Refuge and other charities and academics who have produced a model that is freely available and has been shared with Ofcom. So it is not an extra burden and does not need to delay the implementation of the Bill; in fact, it will greatly aid Ofcom.
The Government are to be congratulated on their amendment to include controlling or coercive behaviour in their list of priority offences. I would like to congratulate them further if they can accept this valuable Amendment 97.
My Lords, I start by commending my noble friend Lady Morgan on her clear introduction to this group of amendments. I also commend the noble Baroness, Lady Kidron, on her powerful speech.
From those who have spoken so far, we have a clear picture of the widespread nature of some of the abuse and offences that women experience when they go online. I note from what my noble friend Lady Morgan said that there is widespread support from a range of organisations outside the Committee for this group of amendments. She also made an important and powerful point about the potential chilling effect of this kind of activity on women, including women in public life, being able to exercise their right to freedom of expression.
I feel it is important for me to make it clear that—this is an obvious thing—I very much support tough legal and criminal sanctions against any perpetrator of violence or sexual abuse against women. I really do understand and support this, and hear the scale of the problem that is being outlined in this group of amendments.
Mine is a dissenting voice, in that I am not persuaded by the proposed solution to the problem that has been described. I will not take up a lot of the Committee’s time, but any noble Lords who were in the House when we were discussing a group of amendments on another piece of legislation earlier this year may remember that I spoke against making misogyny a hate crime. The reason why I did that then is similar, in that I feel somewhat nervous about introducing a code of conduct which is directly relevant to women. I do not like the idea of trying to address some of these serious problems by separating women from men. Although I know it is not the intention of a code such as this or any such measures, I feel that it perpetuates a sense of division between men and women. I just do not like the idea that we live in a society where we try to address problems by isolating or categorising ourselves into different groups of people, emphasising the sense of weakness and being victims of any kind of attack or offence from another group, and assuming that everybody who is in the other group will be a perpetrator of some kind of attack, criticism or violence against us.
My view is that, in a world where we see some of this serious activity happening, we should do more to support young men and boys to understand the proper expectations of them. When we get to the groups of amendments on pornography and what more we can do to prevent children’s access to it, I will be much more sympathetic. Forgive me if this sounds like motherhood and apple pie, but I want us to try to generate a society where basic standards of behaviour and social norms are shared between men and women, young and old. I lament how so much of this has broken down, and a lot of the problems we see in society are the fault of political and—dare I say it?—religious leaders not doing more to promote some of those social norms in the past. As I said, I do not want us to respond to the situation we are in by perpetuating more divisions.
I look forward to hearing what my noble friend the Minister has to say, but I am nervous about the solution proposed in the amendments.
My Lords, it gives me great pleasure to follow the noble Baroness, Lady Stowell of Beeston, not least because she became a dissenting voice, and I was dreading that I might be the only one.
First, I think it important that we establish that those of us who have spent decades fighting violence against women and girls are not complacent about it. The question is whether the physical violence we describe in the Bill is the same as the abuse being described in the amendments. I worry about conflating online incivility, abuse and vile things said with physical violence, as is sometimes done.
I note that Refuge, an organisation I have a great deal of respect for, suggested that the user empowerment duties, which place the burden on women users to filter their own online experience, were the same as asking women to take control of their own safety and protect themselves from violence offline. I thought that was unfair, because user empowerment duties and deciding what you filter out can be women using their agency.
In that context, I wanted to probe Amendment 104, in the name of the noble Lord, Lord Stevenson of Balmacara, and whether affording, as it states, a higher standard of protection to women and girls could actually be disempowering. I am always concerned about discriminatory special treatment for women. I worry that we end up presenting or describing young women as particularly vulnerable due to their sex, overemphasising victimhood. That, in and of itself, can undermine women’s confidence rather than encouraging them to see themselves as strong, resilient and so on. I was especially worried about Amendment 171, in the name of the noble Baroness, Lady Featherstone, but she is not here to move it. It states that content that promotes or perpetuates violence against women and girls should be removed, and the users removed if they are identified as creating or even disseminating it.
What always worries me about this Bill is that, because we want to improve the world—I know that is the joint enterprise here—we could get carried away. Whereas in law we have a very narrow definition of what incitement to violence is, here we are not very specific about it. I worry about the low threshold whereby somebody who creates a horrible sexist meme will be punished, but then someone who just retweets it will be treated in the same way. I want to be able to have a conversation about why that is the wrong thing to do. I am worried, as I always am, about censorship and so on.
The statistics are a bit confusing on this. There are often-repeated statistics, but you need to dig down, look at academic papers and talk to people who work in this field. An academic paper from Oxford Internet Surveys from August 2021 notes that the exceptional prevalence of online hostility to women is largely based on anecdotal experience, and that a closer look across the British population, contrary to conventional wisdom, shows that women are not necessarily more likely than men to experience hateful speech online. It also notes, however, that there is empirical evidence to show that subgroups of women—it cites journalists and politicians—can be disproportionately targeted. I will not go through all the statistics, although I do have them here, but there were very small differences of 2% or 3%, and in some instances young men were more likely to suffer abuse.
Ofcom’s Online Nation report of June 2022 says that women are more negatively affected by trolling and so on, but again, I worry about the gender point. I am worried that what we are trying to tackle here is what we all know to be a toxic and nasty political atmosphere in society that is reflected online. We all know what we are talking about. People bandy around the most vile labels; we see that regularly on social media, and, if you are a woman, it does take on this nasty, sexist side. It is incredibly unpleasant.
We also have to recognise that that is a broad, moral, social and cultural problem, which I hope that we will try to counter. I am no men’s rights sympathiser by any stretch, but I also noticed that the Ofcom report said that young men are more likely than women to have experienced seeing potentially harmful behaviour or content online in the four weeks before the survey response—64% of men as against 60% of women. Threats of physical violence were more prevalent with boys than women—16% versus 11%—while sexual violence was the other way around. So I want us to have a sense of proportion and say that no one on the receiving end of this harmful, nasty trolling should be ignored, regardless of their sex.
It is also interesting—I will finish with this—that UK women are avid users of social media platforms, spending more than a quarter of their waking hours online and around half an hour more than men each day. We say that the online world is inhospitable to women, but there are a lot of them on it regardless, so we need a sense of perspective. Ofcom’s report makes the point that, often,
“the benefits of being online outweigh the risks”.
More women than men disagree with that: 63% of women agree with it, compared with 71% of men. But we need a sense of perspective here because, actually, the majority of young men and women like being online. Sometimes it can give young women a sense of solidarity and sisterhood. All the surveys that I have read say that female participants feel less able to share their opinions and use their voice online, and we have to ask why.
The majority of young people who I work with are women. They say the reason they dare not speak online is not misogynist hate speech but cancel culture. They are walking on eggshells, as there are so many things that you are not allowed to say. Your Lordships will also be aware that, in gender-critical circles, for example, a lot of misogynistic hate speech is directed at women who are not toeing the line on a particular orthodoxy today. I do not want a remedy for that toxicity, with women not being sure if they can speak out because of cancel culture, if that remedy introduces more censorious trends.
My Lords, it is an honour to follow some very knowledgeable speakers, whose knowledge is much greater than mine. Nevertheless, I feel the importance of this debate above and beyond any other that I can think of on this Bill. However, I do not agree with the noble Baroness, Lady Stowell of Beeston, who said that women should not be victims. They are not victims; they are being victimised. We need a code—the code that is being proposed—not for the victims but for the tech companies, because of the many diverse strands of abuse that women face online. This is an enabler for the tech companies to get their heads around what is coming and to understand it a lot better. It is a helpful tool, not a mollycoddling tool at all.
I strongly agree with everything else, apart from what was said by the noble Baroness, Lady Fox, which I will come on to in a second. I and, I am sure, other noble Lords in this Chamber have had many hundreds of emails from concerned people, ordinary people, who nevertheless understand the importance of what this code of practice will achieve today. I speak for them, as well as the others who have supported this particularly important amendment.
As their supporters have pointed out in this Chamber, Amendments 97 and 304 are the top priority for the Domestic Abuse Commissioner, who believes that, if they do not pass, the Bill will not go far enough to prevent and respond effectively to domestic abuse online. The noble Baroness, Lady Fox, spoke about the need to keep a sense of proportion, but online abuse is everywhere. According to the charity Refuge—I think this was mentioned earlier—over one-third of women and 62% of young women have experienced online abuse and harassment.
I am sure that the Minister is already aware that a sector coalition of experts on violence against women and girls put together the code of practice that we are discussing today. It is needed, as I have said, because of the many strands of abuse that are perpetuated online. However, compliance with the new terms of service to protect women and girls is not cheap. In cost-driven organisations, the temptation will be to relax standards as time goes by, which we have seen in the past in the cases of Facebook and Twitter. The operators’ feet must be held to the fire with this new, stricter and more comprehensive code. People’s lives depend on it.
In his remarks, can the Minister indicate whether the Government are at least willing to look at this code? Otherwise, can he explain how the Government will ensure that domestic abuse and its component offences are understood by providers in the round?
My Lords, I rise to support the noble Baronesses, Lady Morgan and Lady Kidron, the right reverend Prelate the Bishop of Gloucester and the noble Lord, Lord Knight of Weymouth, on Amendment 97 to Clause 36 to mandate Ofcom to produce codes of practice, so that these influential online platforms have to respond adequately to tackle online violence against women and girls.
Why should we care about these codes of practice being in the Bill? Not doing so will have far-reaching consequences, of which we have already heard many examples. First, it will threaten progress on gender equality. As the world moves to an increasingly digital future, with more and more connections and conversations moving online, women must have the same opportunity as men to be a part of the online world and benefit from being in the online space.
Secondly, it will threaten the free speech of women. The voices of women are more likely to be suppressed. Because of abuse, women are more likely to reduce their social media activity or even leave social media platforms altogether.
Thirdly, we will be failing in our obligation to protect the human rights of women. Every woman has the right to be and feel safe online. I thank the noble Baroness, Lady Kidron, who highlighted online abuse due to intersecting identities. The noble Baroness, Lady Stowell, mentioned that this could cause divisions; there are divisions already, given the level of online abuse faced by women. Until we get an equal and just society, additional measures are needed. I know that the noble Baroness, Lady Fox, is worried about censorship, but women also have the right to feel safe online and offline. The noble Baroness is worried about whether this is a proportionate response, but I do feel that it is.
Relying on tech companies to self-regulate on VAWG is a bad idea. At present, the overwhelming majority of tech companies are led by men and their employees are most likely to be men, who will be taking decisions on content and on moderating that content. So we are relying on the judgment of a sector that itself needs to be more inclusive of women and is known for not sufficiently tackling the online abuse of women and girls.
I will give a personal example. Someone did not like what I said on Twitter and posted a message with a picture of a noose, which I found threatening. I reported that and got a response to say that it did not violate terms and conditions, so it remained online.
The culture at these tech companies was illustrated a few years ago when employees at Google walked out to protest against sexism. Also, research a couple of years ago by a campaign group called Global Witness found that Facebook used biased algorithms that promoted career and gender stereotypes, resulting in particular job roles being seen by men and others being seen by women. We know that other algorithms are even more harmful and sinister and promote hatred and misogyny. So relying on a sector that may not care much about women’s rights or their well-being to do the right thing is not going to work. Introducing the VAWG code in the Bill will help to make tech companies adequately investigate and respond to reports of abuse and take a proactive approach to minimise and prevent the risk of abuse taking place in the first instance.
I also add my support to the noble Baroness’s Amendment 304, to Clause 207, so that the definition of violence against women and girls is in line with the gold standard framework provided by the Istanbul convention, which was ratified by the UK Government.
The Bill is a unique opportunity to protect women and girls online, so I commend the Government on introducing it to Parliament, but they could do much more to combat violence against women and girls. If they do not, inaction now will result in us sleepwalking into a culture of normalising despicable behaviour towards women even more openly. If online perpetrators of abuse are not tackled robustly now, it will embolden them further to escalate and gaslight victims. The Government must send perpetrators a message that there is no place to hide—not even online. If the Government want the UK to be the safest place in the world to be online, they should agree to these amendments.
My Lords, I will be very brief. My noble friend has very eloquently expressed the support on these Benches for these amendments, and I am very grateful to the noble Baroness, Lady Morgan, for setting out the case so extremely convincingly, along with many other noble Lords. It is, as the noble Baroness, Lady Kidron, said, about the prevention of the normalisation of misogyny. As my noble friend said, it is for the tech companies to prevent that.
The big problem is that the Government have got themselves into a position where—except in the case of children—the Bill now deals essentially only with illegal harms, so you have to pick off these harms one by one and create illegality. That is why we had the debate in the last group about other kinds of harm. This is another harm that we are debating, precisely because the Government amended the Bill in the Commons in the way that they did. But it does not make this any less important. It is quite clear; we have talked about terms of service, user empowerment tools, lack of enforcement, lack of compliance and all the issues relating to these harms. The use of the expression “chilling effect”—I think by the noble Baroness, Lady Kidron—and then the examples given by the noble Baroness, Lady Gohir, absolutely illustrated that. We are talking about the impact on freedom of expression.
I am afraid that, once again, I do not agree with the noble Baroness, Lady Fox. Why do I find myself disagreeing on such a frequent basis? I think the harms override the other aspects that the noble Baroness was talking about.
We have heard about the lack of a proper complaints system—we are back to complaints again. These themes keep coming through, and until the Government see that there are flaws in the Bill, I do not think we are going to make a great deal more progress. The figure given was that more than half of domestic abuse survivors did not receive a response from the platform to their report of domestic abuse-related content. That kind of example demonstrates that we absolutely need this code.
There is an absolutely convincing case for what one of our speakers, probably the right reverend Prelate, called a holistic way of dealing with these abuses. That is what we need, and that is why we need this code.
My Lords, the amendments in this group, which I am pleased to speak to now, shine a very bright light on the fact that there is no equality when it comes to abuse. We are not starting at a level playing field. This is probably the only place that I do not want to level up; I want to level down. This is not about ensuring that men can be abused as much as women; it is about the very core of what the Bill is about, which is to make this country the safest online space in the world. That is something that unites us all, but we do not start in the same place.
I thank all noble Lords for their very considered contributions in unpicking all the issues and giving evidence about why we do not have that level playing field. Like other noble Lords, I am grateful to the noble Baroness, Lady Morgan, for her thorough, illustrative and realistic introduction to this group of amendments, which really framed it today. Of course, the noble Baroness is supported in signing the amendment by the noble Baroness, Lady Kidron, the right reverend Prelate the Bishop of Gloucester and my noble friend Lord Knight.
The requirement in Amendment 97 that there should be an Ofcom code of practice is recognition that many aspects of online violence disproportionately affect women and girls. I think we always need to come back to that point, because nothing in this debate has taken me away from that very clear and fundamental point. Let us remind ourselves that the online face of violence against women and girls includes—this is not a full list—cyberflashing, abusive pile-ons, incel gangs and cyberstalking, to name but a few. Again, we are not starting from a very simple point; we are talking about an evolving online face of violence against women and girls, and the Bill needs to keep pace.
I associate myself with the words of the noble Baroness, Lady Morgan, and other noble Lords in thanking and appreciating the groups and individuals who have already done the work, and who have—if I might use the term—an oven-ready code of practice available to the Minister, should he wish to avail himself of it. I share the comments about the lack of logic. If violence against women and girls is part of the strategic policing requirement, and the Home Secretary says that dealing with violence against women and girls is a priority, why is this not part of a joined-up government approach? That is what we should now be seeing in the Bill. I am sure the Minister will want to address that question.
The right reverend Prelate the Bishop of Gloucester rightly said that abuse is abuse. Whether it is online or offline, it makes no difference. The positive emphasis should be that women and girls should be able to express themselves online as they should be able to offline. Again, that is a basic underlying point of these amendments.
I listened very closely to the words of the noble Baroness, Lady Stowell. I understand her nervousness, and she is absolutely right to bring before the Committee that perhaps a code of conduct of this nature could allow and encourage, to quote her, division. The challenge we have is that women and girls have a different level of experience. We all want to see higher standards of behaviour, as the noble Baroness referred to—I know that we will come back to that later. However, I cannot see how not having a code of conduct will assist those higher standards because the proposed code of conduct simply acknowledges the reality, which is that women and girls are 27 times more likely to be abused online than men are. I want to put on record that this is not about emphasising division, saying that it is all right to abuse men or, as the noble Baroness gives me the opportunity to say, saying that all men are somehow responsible—far from it. As ever, this is something that unites us all: the tackling of abuse wherever it takes place.
Amendment 104 in the name of my noble friend Lord Stevenson proposes an important change to Schedule 4: that
“women and girls, and vulnerable adults”
should have a higher standard of protection than other adult users. That amendment is there because the Bill is silent on these groups. There is no mention of them, so we seek to change this through that amendment.
To return to the issue of women and girls, two-thirds of women who report abuse to internet companies do not feel heard. Three-quarters of women change their behaviour after receiving online abuse. I absolutely agree with the noble Baroness, Lady Kidron, who made the point that the Bill currently assumes that there is no interconnection between different safety duties where somebody has more than one protected characteristic, because it misses reality. One has only to talk to Jewish women to know that, although anti-Semitism knows no bounds, if you are a Jewish woman then there is no doubt that you will be the subject of far greater abuse than your male counterpart. Similarly, women of colour are one-third more likely to be mentioned in abusive tweets than white women. Again, there is no level playing field.
As it stands, the Bill puts an onus on women and girls to protect themselves from online violence and abuse. The problem, as has been mentioned many times, is that user empowerment tools do not incentivise services to address the design of their service, which may be facilitating the spread of violence against women and girls. That point was very well made by my noble friend Lady Healy and the noble Baroness, Lady Gohir, in their contributions.
On the question of the current response to violence against women and girls from tech companies, an investigation by the Times identified that platforms such as TikTok and YouTube are profiting from a wave of misogynist content, with a range of self-styled “self-help gurus”, inspired by the likes of Andrew Tate, offering advice to their millions of followers, encouraging men and boys, in the way described by the noble Baroness, Lady Stowell, to engage with women and girls in such a way that amounts to pure abuse, instructing boys and men to ensure that women and girls in their lives are “compliant”, “insecure” and “well-behaved”. This is not the kind of online space that we seek.
I hope that the Minister, if he cannot accept the amendments, will give his assurance that he can understand what is behind them and the need for action, and will reflect and come back to your Lordships’ House in a way that can allow us to level down, rather than level up, the amount of abuse that is aimed at men but also, in this case in particular, at women and girls.
My Lords, protecting women and girls is a priority for His Majesty’s Government, at home, on our streets and online. This Bill will provide vital protections for women and girls, ensuring that companies take action to improve their safety online and protect their freedom of expression so that they can continue to play their part online, as well as offline, in our society.
On Amendments 97 and 304, tabled by my noble friend Lady Morgan of Cotes, I want to be unequivocal: all service providers must understand the systemic risks facing women and girls through their illegal content and child safety risk assessments. They must then put in place measures that manage and mitigate these risks. Ofcom’s codes of practice will set out how companies can comply with their duties in the Bill.
I assure noble Lords that the codes will cover protections against violence against women and girls. In accordance with the safety duties, the codes will set out how companies should tackle illegal content and activity confronting women and girls online. This includes the several crimes that we have listed as priority offences, which we know are predominantly perpetrated against women and girls. The codes will also cover how companies should tackle harmful online behaviour and content towards girls.
Companies will be required to implement systems and processes designed to prevent people encountering priority illegal content and minimise the length of time for which any such content is present. In addition, Ofcom will be required to carry out broad consultation when drafting codes of practice to harness expert opinions on how companies can address the most serious online risks, including those facing women and girls. Many of the examples that noble Lords gave in their speeches are indeed reprehensible. The noble Baroness, Lady Kidron, talked about rape threats and threats of violence. These, of course, are examples of priority illegal content and companies will have to remove and prevent them.
My noble friend Lady Morgan suggested that the Bill misses out the specific course of conduct that offences in this area can have. Clause 9 contains provisions to ensure that services
“mitigate and manage the risk of the service being used for the commission or facilitation of”
an offence. This would capture patterns of behaviour. In addition, Schedule 7 contains several course of conduct offences, including controlling and coercive behaviour, and harassment. The codes will set out how companies must tackle these offences where this content contributes to a course of conduct that might lead to these offences.
To ensure that women’s and girls’ voices are heard in all this, the Bill will, as the right reverend Prelate noted, make it a statutory requirement for Ofcom to consult the Victims’ Commissioner and the domestic abuse commissioner about the formation of the codes of practice. As outlined, the existing illegal content, child safety and child sexual abuse and exploitation codes will already cover protections for women and girls. Creating a separate code dealing specifically with violence against women and girls would mean transposing or duplicating measures from these in a separate code.
In its recent communication to your Lordships, Ofcom stated that it will be consulting quickly on the draft illegal content and child sexual abuse and exploitation codes, and has been clear that it has already started the preparatory work for these. If Ofcom were required to create a separate code on violence against women and girls this preparatory work would need to be revised, with the inevitable consequence of slowing down the implementation of these vital protections.
An additional stand-alone code would also be duplicative and could cause problems with interpretation and uncertainty for Ofcom and providers. Linked to this, the simpler the approach to the codes, the higher the rates of compliance are likely to be. The more codes there are covering specific single duties, the more complicated it will be for providers, which will have to refer to multiple different codes, and the harder for businesses to put in place the right protections for users. Noble Lords have said repeatedly that this is a complex Bill, and this is an area where I suggest we should not make it more complex still.
As the Bill is currently drafted, Ofcom is able to draft codes in a way that addresses a range of interrelated risks affecting different groups of users, such as people affected in more than one way; a number of noble Lords dealt with that in their contributions. For example, combining the measures that companies can take to tackle illegal content targeting women and girls with the measures they can take to tackle racist abuse online could ensure a more comprehensive and effective approach that recognises the point, which a number of noble Lords made, that people with more than one protected characteristic under the Equality Act may be at compound risk of harm. If the Bill stipulated that Ofcom separate the offences that disproportionately affect women and girls from other offences in Schedule 7, this comprehensive approach to tackling violence against women and girls online could be lost.
Could my noble friend the Minister confirm something? I am getting rather confused by what he is saying. Is it the case that there will be just one mega code of practice to deal with every single problem, or will there be lots of different codes of practice to deal with the problems? I am sure the tech platforms will have sufficient people to be able to deal with them. My understanding is that Ofcom said that, while the Bill might not mandate a code of practice on violence against women and girls, it would in due course be happy to look at it. Is that right, or is my noble friend the Minister saying that Ofcom will never produce a code of practice on violence against women and girls?
It is up to Ofcom to decide how to set the codes out. What I am saying is that the codes deal with specific categories of threat or problem—illegal content, child safety content, child sexual abuse and exploitation—rather than with specific audiences who are affected by these sorts of problems. There is a circularity here in some of the criticism that we are not reflecting the fact that there are compound harms to people affected in more than one way and then saying that we should have a separate code dealing with one particular group of people because of one particular characteristic. We are trying to deal with categories of harm that we know disproportionately affect women and girls but which of course could affect others, as the noble Baroness rightly noted. Amendment 304—
I thank the Minister for giving way. There is a bit of a problem that I would like to raise. I think the Minister is saying that there should not be a code of practice in respect of violence against women and girls. That sounds to me like there will be no code of practice in this one particular area, which seems rather harsh. It also does not tackle the issue on which I thought we were all agreed, even if we do not agree the way forward: namely, that women and girls are disproportionately affected. If it is indeed the case that the Minister feels that way, how does he suggest this is dealt with?
There are no codes designed for Jewish people, Muslim people or people of colour, even though we know that they are disproportionately affected by some of these harms as well. The approach taken is to tackle the problems, which we know disproportionately affect all of those groups of people and many more, by focusing on the harms rather than the recipients of the harm.
Can I check something with my noble friend? This is where the illogicality is. The Government have mandated in the Strategic Policing Requirement that violence against women and girls is a national threat. I do not disagree with him that other groups of people will absolutely suffer abuse and online violence, but the Government themselves have said that violence against women and girls is a national threat. I understand that my noble friend has the speaking notes, the brief and everything else, so I am not sure how far we will get on this tonight, but, given the Home Office stance on it, I think that to say that this is not a specific threat would be a mistake.
With respect, I do not think that that is a perfect comparison. The Strategic Policing Requirement is an operational policing document intended for chief constables and police and crime commissioners in the important work that they do, to make sure they have due regard for national threats as identified by the Home Secretary. It is not something designed for commercial technology companies. The approach we are taking in the Bill is to address harms that can affect all people and which we know disproportionately affect women and girls, and harms that we know disproportionately affect other groups of people as well.
We have made changes to the Bill: the consultation with the Victims’ Commissioner and the domestic abuse commissioner, the introduction of specific offences to deal with cyber-flashing and other sorts of particular harms, which we know disproportionately affect women and girls. We are taking an approach throughout the work of the Bill to reflect those harms and to deal with them. Because of that, respectfully, I do not think we need a specific code of practice for any particular group of people, however large and however disproportionately they are affected. I will say a bit more about our approach. I have said throughout, including at Second Reading, and my right honourable friend the Secretary of State has been very clear in another place as well, that the voices of women and girls have been heard very strongly and have influenced the approach that we have taken in the Bill. I am very happy to keep talking to noble Lords about it, but I do not think that the code my noble friend sets out is the right way to go about solving this issue.
Amendment 304 seeks to adopt the Istanbul convention definition of violence against women and girls. The Government are already compliant with the Convention on Preventing and Combating Violence Against Women and Domestic Violence, which was ratified last year. However, we are unable to include the convention’s definition of violence against women and girls in the Bill, as it extends to legal content and activity that is not in scope of the Bill as drafted. Using that definition would therefore cause legal uncertainty for companies. It would not be appropriate for the Government to require companies to remove legal content accessed by adults who choose to access it. Instead, as noble Lords know, the Government have brought in new duties to improve services’ transparency and accountability.
Amendment 104 in the name of the noble Lord, Lord Stevenson, seeks to require user-to-user services to provide a higher standard of protection for women, girls and vulnerable adults than for other adults. The Bill already places duties on service providers and Ofcom to prioritise responding to content and activity that presents the highest risk of harm to users. This includes users who are particularly affected by online abuse, such as women, girls and vulnerable adults. In overseeing the framework, Ofcom must ensure that there are adequate protections for those who are most vulnerable to harm online. In doing so, Ofcom will be guided by its existing duties under the Communications Act, which requires it to have regard when performing its duties to the
“vulnerability of children and of others whose circumstances appear to OFCOM to put them in need of special protection”.
The Bill also amends Ofcom’s general duties under the Communications Act to require that Ofcom, when carrying out its functions, considers the risks that all members of the public face online, and ensures that they are adequately protected from harm. This will form part of Ofcom’s principal duty and will apply to the way that Ofcom performs all its functions, including when producing codes of practice.
In addition, providers’ illegal content and child safety risk assessment duties, as well as Ofcom’s sectoral risk assessment duties, require them to understand the risk of harm to users on their services. In doing so, they must consider the user base. This will ensure that services identify any specific risks facing women, girls or other vulnerable groups of people.
As I have mentioned, the Bill will require companies to prioritise responding to online activity that poses the greatest risk of harm, including where this is linked to vulnerability. Vulnerability is very broad. The threshold at which somebody may arguably become vulnerable is subjective, context-dependent and may be temporary. The majority of UK adult users could be defined as vulnerable in particular circumstances. In practice, this would be very challenging for Ofcom to interpret if it were added to the safety objectives in this way. The existing approach allows greater flexibility so that companies and Ofcom can focus on the greatest threats to different groups of people at any given time. This allows the Bill to adapt to and keep pace with changing risk patterns that may affect different groups of people.
I am conscious that, for understandable reasons, the noble Baroness, Lady Featherstone, is not here to speak to her Amendment 171. I will touch on it briefly in her absence. It relates to platform transparency about providers’ approaches to content that promotes violence against women, girls and vulnerable groups. Her amendment raises an extremely important issue. It is essential that the Bill increase transparency about the abuse of women, girls and vulnerable people online. This is why the Bill already empowers Ofcom to request extensive information about illegal and harmful online abuse of women, girls and vulnerable groups in transparency reports. Accepting this amendment would therefore be duplicative. I hope that the noble Baroness would agree that it is not needed.
I hope that I have given some reassurance that the Bill covers the sort of violent content about which noble Lords are rightly concerned, no matter against whom it is directed. The Government recognise that many of these offences and much of the violence do disproportionately affect women and girls in the way that has been correctly pointed out. We have reflected this in the way in which the Bill and its regulatory framework are to operate. I am happy to keep discussing this matter with my noble friend. She is right that it is important, but I hope that, at this juncture, she will be content to withdraw her amendment.
My Lords, I thank my noble friend for his response, which I will come on to in a moment. This has been a fascinating debate. Yet again, it has gone to the heart of some of the issues with this Bill. I thank all noble Lords who have spoken, even though I did not quite agree with everything they said. It is good that this Committee shows just how seriously it takes the issue of violence against women and girls. I particularly thank all those who are watching from outside. This issue is important to so many.
There is no time to run through all the brilliant contributions that have been made. I thank the right reverend Prelate the Bishop of Gloucester for her support. She made the point that, these days, for most people, there is no online/offline distinction. To answer one of the points made, we sometimes see violence or abuse that starts online and then translates into the offline world. Teachers in particular are saying that this is the sort of misogyny they are seeing in classrooms.
As the noble Baroness, Lady Merron, said, the onus should not be on women and girls to remove themselves from online spaces. I also thank the noble Baronesses, Lady Kidron and Lady Gohir, for their support. The noble Baroness, Lady Kidron, talked about the toxic levels of online violence. Parliament needs to say that this is not okay—which means that we will carry on with this debate.
I thank the noble Baroness, Lady Healy, for her contribution. She illustrated so well why a code of practice is needed. We can obviously discuss this, but I do not think the Minister is quite right about the user reporting element. For example, we have heard various women speaking out who have had multiple rape threats. At the moment, the platforms require each one to be reported individually. They do not put them together and then work out the scale of threat against a particular user. I am afraid that this sort of threat would not breach the illegal content threshold and therefore would not be caught by the Bill, despite what the Minister has been saying.
I agree with my noble friend Lady Stowell. I would love to see basic standards—I think she called it “civility”—and a better society between men and women. One of the things that attracts me most to the code of practice is that it seeks cultural and societal changes—not just whack-a-mole with individual offences but changing the whole online culture to build a healthier and better society.
I will certainly take up the Minister’s offer of a meeting. His response was disappointing. There was no logic to it at all. He said that the voice of women and girls is heard throughout the Bill. How can this be the case when the very phrase “women and girls” is not mentioned in 262 pages? Some 100,000 people outside this Chamber disagree with his position on the need for there to be a code of practice. I say to both Ofcom and the tech platforms that a code has been drafted. Please do not do the “Not drafted here; we’re not going to adopt it”. It is there, the work has been done and it can easily be taken on.
I would be delighted to discuss the definition in Amendment 304 with my noble friend. I will of course withdraw my amendment tonight, but we will certainly return to this on Report.
Amendment 97 withdrawn.
Amendment 98 not moved.
House resumed. Committee to begin again not before 8.45 pm.