Online Safety Bill

Volume 829: debated on Tuesday 9 May 2023

Committee (5th Day) (Continued)

Clause 12: User empowerment duties

Debate on Amendment 38 resumed.

My Lords, before we continue this debate, I want to understand why we have changed the system so that we break part way through a group of amendments. I am sorry, but I think this is very poor. It is definitely a retrograde step. Why are we doing it? I have never experienced this before. I have sat here and waited for the amendment I have just spoken to. We have now had a break; it has broken the momentum of that group. It was even worse last week, because we broke for several days half way through the debate on an amendment. This is unheard of in my memory of 25 years in this House. Can my noble friend the Minister explain who made this decision, and how this has changed?

I have not had as long in your Lordships’ House, but this is not unprecedented, in my experience. These decisions are taken by the usual channels; I will certainly feed that back through my noble friend. One of the difficulties, of course, is that because there are no speaking limits on legislation and we do not know how many people want to speak on each amendment, the length of each group can be variable, so I think this is for the easier arrangement of dinner-break business. Also, for the dietary planning of those of us who speak on every group, it is useful to have some certainty, but I do appreciate my noble friend’s point.

Okay; I thank my noble friend for his response. However, I would just say that we never would have broken like that, before 7.30 pm. I will leave it at that, but I will have a word with the usual channels.

My Lords, I rise to speak to Amendments 141 and 303 in the name of the noble Lord, Lord Stevenson. Before I do, I mention in passing how delighted I was to see Amendment 40, which carries the names of the Minister and the noble Lord, Lord Stevenson—may there be many more like that.

I am concerned that without Amendments 141 and 303, the concept of “verified” is not really something that the law can take seriously. I want to ask the Minister two rather technical questions. First, how confident can the Government and Ofcom be that with the current wording, Ofcom could form an assessment of whether Twitter’s current “verified by blue” system satisfies the duty in terms of robustness? If it does not, does Ofcom have the power to send it back to the drawing board? I am sure noble Lords understand why I raise this: we have recently seen “verified by blue” ticks successfully bought by accounts impersonating Martin Lewis, US Senators and Putin propagandists. My concern is that in the absence of a definition of verification in the Bill such as the one proposed in Amendments 141 and 303, where in the current wording does Ofcom have the authority to say that “verified by blue” does not satisfy the user verification duty?

My second question is similar. We see now around the world—it is not available in the UK—that Meta has a verified subscription, for which you can pay around $15 per month. It is being piloted in the US as we speak. Again, I ask whether that satisfies the duty in terms of it being affordable to the average UK user. I am concerned that most UK social media users will not be able to afford £180 per social media account for verification. If that ends up being Meta’s UK offering, many users would not be given a proper, meaningful chance to be verified. What powers are there in the Bill for Ofcom to send Meta back and offer something else? So my questions really are about what “verified” means in terms of the Bill.

My Lords, I rise to speak to Amendment 141 in the names of the noble Lords, Lord Stevenson and Lord Clement-Jones. Once again, I register the support of my noble friend Lady Campbell of Surbiton, who feels very strongly about this issue.

Of course, there is value in transparency online, but anonymity can be vital for certain groups of people, such as those suffering domestic abuse, those seeking help or advice on matters they wish to remain confidential, or those who face significant levels of hatred or prejudice because of who they are, how they live or what they believe in. Striking the right balance is essential, but it is equally important that everyone who wishes to verify their identity and access the additional protections that this affords can do so easily and effectively, and that this opportunity is open to all.

Clause 57 requires providers of category 1 services to offer users the option to verify their identity, but it is up to providers to decide what form of verification to offer. Under subsection (2) it can be “of any kind”, and it need not require any documentation. Under subsection (3), the terms of service must include a “clear and accessible” explanation of how the process works and what form of verification is available. However, this phrase in itself is open to interpretation: clear and accessible for one group may be unclear and inaccessible to another. Charities, including Mencap, are concerned that groups such as people with a learning disability could be locked out of using these tools.

It is also relevant that people with a learning disability are less likely to own forms of photographic ID such as passports or driving licences. Should a platform require this type of ID, large numbers of people with a learning disability would be denied access. In addition, providing an email or phone number and verifying this through an authentication process could be extremely challenging for those people who do not have the support in place to help them navigate this process. This further disadvantages groups of people who already suffer some of the most extensive restrictions in living their everyday lives.

Clause 58 places a duty on Ofcom to provide guidance to help providers comply with their duty, but this guidance is optional. Amendment 141 aims to strengthen Clause 58 by requiring Ofcom to set baseline principles and standards for the guidance. It would ensure, for example, that the guidance considers accessibility for disabled as well as vulnerable adults and aligns with relevant guidance on related matters such as age verification; it would ensure that verification processes are effective; and it would ensure that the interests of disabled users are covered in Ofcom’s pre-guidance consultation.

Online can be a lifeline for disabled and vulnerable adults, providing access to support, advice and communities of interest, and this is particularly important as services in the real world are diminishing, so we need to ensure that user-verification processes do not act as a further barrier to inclusion for people with protected characteristics, especially those with learning disabilities.

My Lords, the speech of the noble Baroness, Lady Buscombe, raised so many of the challenges that people face online, and I am sure that the masses who are watching parliamentlive as we speak, even if they are not in here, will recognise what she was talking about. Certainly, some of the animal rights activists can be a scourge, but I would not want to confine this to them, because I think trashing reputations online and false allegations have become the activists’ chosen weapon these days. One way that I describe cancel culture, as distinct from no-platforming, is that it takes the form of some terrible things being said about people online, a lot of trolling, things going viral and using the online world to lobby employers to get people sacked, and so on. It is a familiar story, and it can be incredibly unpleasant. The noble Baroness and those she described have my sympathy, but I disagree with her remedy.

An interesting thing is that a lot of those activities are not carried out by those who are anonymous. It is striking that a huge number of people with large accounts, well-known public figures with hundreds of thousands of followers—sometimes with more than a million—are prepared to do exactly what I described in plain sight, often to me. I have thought long and hard about this, because I really wanted to use this opportunity to read out a list and name and shame them, but I have decided that, when they go low, I will try to go at least a little higher. But subtweeting and twitchhunts are an issue, and one reason why we think we need an online harms Bill. As I said, I know that sometimes it can feel that if people are anonymous, they will say things that they would not say to your face or if you knew who they were, but I think it is more the distance of being online: even when you know who they are, they will say it to you or about you online, and then when you see them at the drinks reception, they scuttle away.

My main objection, however, to the amendment of the noble Baroness, Lady Buscombe, and the whole question of anonymity in general is that it treats anonymity as though it is inherently unsafe. There is a worry, more broadly on verification, about creating two tiers of users: those who are willing to be verified and those who are not, and those who are not somehow having a cloud of suspicion over them. There is a danger that undermining online anonymity in the UK could set a terrible precedent, likely to be emulated by authoritarian Governments in other jurisdictions, and that is something we must bear in mind.

On evidence, I was interested in Big Brother Watch’s report on some analysis by the New Statesman, which showed that there is little evidence to suggest that anonymity itself makes online discourse more febrile. It did an assessment involving tweets sent to parliamentarians since January 2021, and said there was

“little discernible difference in the nature or tone of the tweets that MPs received from anonymous or non-anonymous accounts. While 32 per cent of tweets from anonymous accounts were classed as angry according to the metric used by the New Statesman, so too were 30 per cent of tweets from accounts with full names attached. Similarly, 5.6 per cent of tweets from anonymous accounts included swear words, only slightly higher than the figure of 5.3 per cent for named accounts.”

It went through various metrics, but it said, “slightly higher, not much of a difference”. That is to be borne in mind: the evidence is not there.

In this whole debate, I have wanted to emphasise freedom as at least equal to, if not of greater value than, the safetyism of this Bill, but in this instance, I will say that, as the noble Baroness, Lady Bull, said, for some people anonymity is an important safety mechanism. It is a tool in the armoury of those who want to fight the powerful. It can be anyone: for young people experimenting with their sexuality and not out, it gives them the freedom to explore that. It can be, as was mentioned, survivors of sexual violence or domestic abuse. It is certainly crucial to the work of journalists, civil liberties activists and whistleblowers in the UK and around the world. Many of the Iranian women’s accounts are anonymous: they are not using their correct names. The same is true of Hong Kong activists; I could go on.

Anyway, in our concerns about the Bill, compulsory identity verification means being forced to share personal data, so there is a privacy issue for everyone, not just the heroic civil liberties people. In a way, it is your own business why you are anonymous—that is the point I am trying to make.

There are so many toxic issues at the moment that a lot of people cannot just come out. I know I often mention the gender-critical issue, but it is true that in many professions, you cannot give your real name or you will not just be socially ostracised but potentially jeopardise your career. I wrote an article during the 2016-17 days called Meet the Secret Brexiteers. It was true that many teachers and professors I knew who voted to leave had to be anonymous online or they would not have survived the cull.

Finally, I do not think that online anonymity or pseudonymity is a barrier to tracking down and prosecuting those who commit the kind of criminal activity on the internet described, creating some of the issues we are facing. Police reports show that in 2017-18, 96% of attempts by public authorities to identify anonymous users of social media accounts, their email addresses and telephone numbers, resulted in successful identification of the suspect in the investigation; in other words, the police already have a range of intrusive powers to track down individuals, should there be a criminal problem, and the Investigatory Powers Act 2016 allows the police to acquire communications data—for example, email addresses or the location of a device—from which alleged illegal anonymous activity is conducted and use it as evidence in court.

If it is not illegal but just unpleasant, I am afraid that is the world we live in. I would argue that what we require in febrile times such as these is not bans or setting the police on people but to set the example of civil discourse, have more speech and show that free speech is a way of conducting disagreement and argument without trashing reputations.

My Lords, what an unusually reticent group we have here for this group of amendments. I had never thought of the noble Baroness, Lady Fox, as being like Don Quixote, but she certainly seems to be tilting at windmills tonight.

I go back to the Joint Committee report, because what we said there is relevant. We said:

“Anonymous abuse online is a serious area of concern that the Bill needs to do more to address. The core safety objectives apply to anonymous accounts as much as identifiable ones. At the same time, anonymity and pseudonymity are crucial to online safety for marginalised groups, for whistleblowers, and for victims of domestic abuse and other forms of offline violence. Anonymity and pseudonymity themselves are not the problem and ending them would not be a proportionate response”.

We were very clear; the Government’s response on this was pretty clear too.

We said:

“The problems are a lack of traceability by law enforcement, the frictionless creation and disposal of accounts at scale, a lack of user control over the types of accounts they engage with and a failure of online platforms to deal comprehensively with abuse on their platforms”.

We said there should be:

“A requirement for the largest and highest risk platforms to offer the choice of verified or unverified status and user options on how they interact with accounts in either category”.

Crucially for these amendments, we said:

“We recommend that the Code of Practice also sets out clear minimum standards to ensure identification processes used for verification protect people’s privacy—including from repressive regimes or those that outlaw homosexuality”.

We were very clear about the difference between stripping away anonymity and ensuring that verification was available where the user wanted to engage only with those who had verified themselves. Requiring platforms to allow users—

I am sorry to interrupt the noble Lord, but I would like to ask him whether, when the Joint Committee was having its deliberations, it ever considered, in addition to people’s feelings and hurt, their livelihoods.

Of course. I think we looked at it in the round and thought that stripping away anonymity could in many circumstances be detrimental to those, for instance, working in hostile regimes or regimes where human rights were under risk. We considered a whole range of things, and the whole question about whether you should allow anonymity is subject to those kinds of human rights considerations.

I take the noble Baroness’s point about business, but you have to weigh up these issues, and we came around the other side.

Does the noble Lord not think that many people watching and listening to this will be thinking, “So people in far-off regimes are far more important than I am—I who live, work and strive in this country”? That is an issue that I think was lacking through the whole process and the several years that this Bill has been discussed. Beyond being hurt, people are losing their livelihoods.

I entirely understand what the noble Baroness is saying, and I know that she feels particularly strongly about these issues given her experiences. The whole Bill is about trying to weigh up different aspects—we are on day 5 now, and this has been very much the tenor of what we are trying to talk about in terms of balance.

I want to reassure the noble Baroness that we did discuss anonymity in relation to the issues that she has put forward. A company should not be able to use anonymity as an excuse not to deal with the situation, and that is slightly different from simply saying, “We throw our hands up on those issues”.

There was a difference between the fact that companies are using anonymity to say, “We don’t know who it is, and therefore we can’t deal with it”, and the idea that they should take action against people who are abusing the system and the terms of service. It is subtle, but it is very meaningful in relation to what the noble Baroness is suggesting.

That is a very fair description. We have tried to emphasise throughout the discussion on the Bill that it is about not just content but how the system and algorithms work in terms of amplification. In page 35 of our report, we try to address some of those issues—it is not central to the point about anonymity, but we certainly talked about the way that messages are driven by the algorithm. Obviously, how that operates in practice and how the Bill as drafted operates is what we are kicking the tyres on at the moment, and the noble Baroness is absolutely right to do that.

The Government’s response was reasonably satisfactory, but this is exactly why this group explores the definition of verification and so on, and tries to set standards for verification, because we believe that there is a gap in all this. I understand that this is not central to the noble Baroness’s case, but—believe me—the discussion of anonymity was one of the most difficult issues that we discussed in the Joint Committee, and you have to fall somewhere in that discussion.

Requiring platforms to allow users to see other users’ verification status is a crucial further pillar to user empowerment, and it provides users with a key piece of information about other users. Being able to see whether an account is verified would empower victims of online abuse or threats—I think this partly answers the noble Baroness’s question—to make more informed judgments about the source of the problem, and therefore take more effective steps to protect themselves. Making verification status visible to all users puts more choice in their hands as to how they manage the higher risks associated with non-verified and anonymous accounts, and offers them a lighter-touch alternative to filtering out all non-verified users entirely.

We on these Benches support the amendments that have been put forward. Amendment 141 aims to ensure that a user verification duty delivers in the way that the public and Government hope it will—by giving Ofcom a clear remit to require that the verification systems that platforms are required to develop in response to the duty are sufficiently rigorous and accessible to all users.

I was taken by what the noble Baroness, Lady Bull, said, particularly the case for Ofcom’s duties as regards those with disabilities. We need Ofcom to be tasked with setting out the principles and minimum standards, because otherwise platforms will try to claim, as verification, systems that do not genuinely verify a user’s identity, are unaffordable to ordinary users or use their data inappropriately.

Likewise, we support Amendment 303, which would introduce a definition of “user identity verification” into the Bill to ensure that we are all on the same page. In Committee in the House of Commons, Ministers suggested that “user identity verification” is an everyday term so does not need a definition. This amendment, which no doubt the noble Baroness, Lady Merron, will speak to in more detail, is bang on point as far as that is concerned. That was not a convincing answer, and that is why this amendment is particularly apt.

I heard what the noble Baroness, Lady Buscombe, had to say, but in many ways the amendment in the previous group in the name of the noble Lord, Lord Knight, met some of the noble Baroness’s concerns. As regards the amendment in the name of the noble Lord, Lord Moylan, we are all Wikipedia fans, so we all want to make sure that there is no barrier to Wikipedia operating successfully. I wonder whether perhaps the noble Lord is making quite a lot out of the Wikipedia experience, but I am sure the Minister will enlighten us all and will have a spot-on response for him.

My Lords, I am pleased to speak on this group of amendments, and I will particularly address the amendments in the name of my noble friend Lord Stevenson. To start with the very positive, I am very grateful to the Minister for signing Amendment 40—as has already been commented, this is hopefully a sign of things to come. My observation is that it is something of a rarity, and I am containing my excitement as it was agreement over one word, “effectively”. Nevertheless, it is very welcome support.

These amendments aim to make it clearer to users whether those whom they interact with are verified or non-verified, with new duties backed up by a set of minimum standards, to be reflected in Ofcom’s future guidance on the user verification duty, with standards covering—among other things—privacy and data protection. The noble Lord, Lord Clement-Jones, helpfully referred your Lordships’ House to the report of the Joint Committee and spent some useful time on the challenges over anonymity. As is the case with so many issues on other Bills and particularly on this one, there is a balance to be struck. Given the proliferation of bots and fake profiles, we must contemplate how to give confidence to people that they are interacting with real users.

Amendment 141, tabled by my noble friend Lord Stevenson and supported by the noble Lord, Lord Clement-Jones, requires Ofcom to set a framework of principles and minimum standards for the user verification duty. The user verification duty is one of the most popular changes to be made to the Bill following the pre-legislative scrutiny process and reflects a recommendation of the Joint Committee. Why is it popular? Because the public understand that the current unregulated approach by social media platforms is a major enabler of harmful online behaviour. Anonymous accounts are more likely to engage in abuse or harassment and, for those at the receiving end, threats from anonymous accounts can feel even more frightening, while the chances are lower of any effective enforcement from the police or platforms.

As we know, bad actors use networks of fake accounts to peddle disinformation and divisive conspiracy theories. I am sure that we will come back to this in later groups. This amendment aims to ensure that the user verification duty delivers in the way that the public and the Government hope that it will. It requires that the systems which platforms develop in response to the duty are sufficiently rigorous and accessible to all users.

The noble Baroness, Lady Kidron, talked about affordability, something that I would like to amplify. There will potentially be platforms which try to claim that verification systems somehow genuinely verify a user’s identity when they do not, or they will be unaffordable to ordinary users, as the noble Baroness said, or data will be used inappropriately. This is not theoretical. She referred to the Meta-verified product, which looks like it might be more rigorous, but at a cost of $180 per year per account, which will not be within the grasp of many people. Twitter is now also selling blue ticks of verification for $8, including a sale to those who are scamming, impersonating, and who are propagandists for figures in our world such as Putin. This amendment future-proofs and allows flexibility. It will not tie the hands of either the regulator or the platforms. Therefore, I hope that it can find some favour with the Minister.

In Amendment 303, again tabled by my noble friend Lord Stevenson and supported by the noble Lord, Lord Clement-Jones, there is an addition of the definition of “user identity verification”. I agree with the noble Lord about how strange it was that, in Committee in the Commons, Ministers felt that user identity verification was somehow an everyday term which did not need definition. I dispute that. It is no better left to common sense than any other terms that we do have definitions for in Clause 207—for example, “age assurance”, “paid-for advertisement” and “terms of service”. All these get definitions. Surely it is very wise to define user identity verification.

Without definition, there is obviously scope for dispute about how verification is defined. As we heard earlier in Committee, a dispute over what something means only creates the conditions for uncertainty, delay and legal costs. Therefore, I hope that we can see a brief definition that provides clarity for regulators and platforms and reduces the potential for disputes and enforcement delays. If we could rely on platforms to operate in good faith, in the interests of all of us, we would not even need the Bill.

Amendment 41, again tabled by my noble friend Lord Stevenson and supported by the noble Lord, Lord Clement-Jones, would require category 1 services to make visible to users whether another user is verified or non-verified. There is already a duty to allow users to be verified and to allow all users to filter out interaction with unverified accounts, but these duties must be—to use that word again—effective.

In cases of fraud, we well know that online scammers rely heavily on deceptive fake accounts, often backed up by reviews from other fake accounts, and that they will think twice about going through any credible verification process because it will make them more traceable. So a simple and clear piece of advice, if we become able to use it, would be to check if the user you are interacting with is verified. That would be powerful advice for consumers to help them avoid fraud.

In the case of disinformation—again, something we will return to in a later group—bad actors, including foreign Governments, are setting up networks of fake accounts which make all sorts of false claims about their identity: maybe that they are a doctor, a British Army veteran or an expert in vaccines. We have seen and heard them all. We ask the public to check the source of the information they read, and that would be a lot easier if it was obvious who is verified and who is not. For those who are subject to online abuse or threats, being able to see if an account is verified would empower them to make more informed decisions about the source of the problem, and therefore to take more definitive steps to protect themselves.

It is absolutely right, as the noble Baronesses, Lady Bull and Lady Fox, outlined, that there are very legitimate reasons why some people do not want their identity shared when they are using a service. This issue was raised with me by a number of young people that I, like other noble Lords, had the opportunity to speak to at a meeting organised by the NSPCC. They explained how they experienced the online world and how they wanted to be able to use it, but there are times when they need to protect their identity in order to benefit from using it and to explore various aspects of themselves, and I believe we should enable that protection.

Amendments in this group from the noble Lord, Lord Moylan, bring us back to previous debates on crowdsourced sites such as Wikipedia, so I will not repeat the same points made in previous debates, but I feel sure that the Minister will provide the reassurance that the noble Lord seeks, and we all look forward to it.

I have a question for the Minister in concluding my comments on this group. Could he confirm whether, under the current provisions, somebody’s full name would have to be publicly displayed for the verification duty to have been met, or could they use a pseudonym or a generic username publicly, with verification having taken place in a private and secure manner? I look forward to hearing from the Minister.

My Lords, the range of the amendments in this group indicates the importance of the Government’s approach to user verification and non-verified user duties. The way these duties have been designed seeks to strike a careful balance between empowering adults while safeguarding privacy and anonymity.

Amendments 38, 39, 139 and 140 have been tabled by my noble friend Lord Moylan. Amendments 38 and 39 seek to remove subsections (6) and (7) of the non-verified users’ duties. These place a duty on category 1 platforms to give adult users the option of preventing non-verified users interacting with their content, reducing the likelihood that a user sees content from non-verified users. I want to be clear that these duties do not require the removal of legal content from a service and do not impinge on free speech.

In addition, there are already existing duties in the Bill to safeguard legitimate online debate. For example, category 1 services will be required to assess the impact on free expression of their safety policies, including the impact of their user empowerment tools. Removing subsections (6) and (7) of Clause 12 would undermine the Bill’s protection for adult users of category 1 services, especially the most vulnerable. It would be entirely at the service provider’s discretion to offer users the ability to minimise their exposure to anonymous and abusive users, sometimes known as trolls. In addition, instead of mandating that users verify their identity, the Bill gives adults the choice. On that basis, I am confident that the Bill already achieves the effect of Amendment 139.

Amendment 140 seeks to reduce the amount of personal data transacted as part of the verification process. Under subsection (3) of Clause 57, however, providers will be required to explain in their terms of service how the verification process works, empowering users to make an informed choice about whether they wish to verify their identity. In addition, the Bill does not alter the UK’s existing data protection laws, which provide people with specific rights and protections in relation to the processing of their personal data. Ofcom’s guidance in this area will reflect existing laws, ensuring that users’ data is protected where personal data is processed. I hope my noble friend will therefore be reassured that these duties reaffirm the concept of choice and uphold the importance of protecting personal data.

While I am speaking to the questions raised by my noble friend, I turn to those he asked about Wikipedia. I have nothing further to add to the comments I made previously, not least that it is impossible to pre-empt the assessments that will be made of which services fall into which category. Of course, assessments will be made at the time, based on what the services do at the time of the assessment, so if he will forgive me, I will not be drawn on particular services.

To speak in more general terms, category 1 services are those with the largest reach and the greatest influence over public discourse. The Bill sets out a clear process for determining category 1 providers, based on thresholds set by the Secretary of State in secondary legislation following advice from Ofcom. That is to ensure that the process is objective and evidence based. To deliver this advice, Ofcom will undertake research into the relationship between how quickly, easily and widely user-generated content is disseminated by that service, the number of users and functionalities it has and other relevant characteristics and factors.

Will my noble friend at least confirm what he said previously: namely, that it is the Government’s view—or at least his view—that Wikipedia will not qualify as a category 1 service? Those were the words I heard him use at the Dispatch Box.

That is my view, on the current state of play, but I cannot pre-empt an assessment made at a point in the future, particularly if services change. I stand by what I said previously, but I hope my noble friend will understand if I do not elaborate further on this, at the risk of undermining the reassurance I might have given him previously.

Amendments 40, 41, 141 and 303 have been tabled by the noble Lord, Lord Stevenson of Balmacara, and, as noble Lords have noted, I have added my name to Amendment 40. I am pleased to say that the Government are content to accept it. The noble Baroness, Lady Merron, should not minimise this, because it involves splitting an infinitive, which I am loath to do; but, if this is a statement of intent, I have let that one go, in the spirit of consensus. Amendment 40 amends Clause 12(7) to ensure that the tools which will allow adult users to filter out content from non-verified users are effective, and I am pleased to add my name to it.

Amendment 41 seeks to make it so that users can see whether another user is verified or not. I am afraid we are not minded to accept it. While I appreciate the intent, forcing users to show whether they are verified or not may have unintended consequences for those who are unable to verify themselves for perfectly legitimate reasons. This risks creating a two-tier system online. Users will still be able to set a preference to reduce their interaction with non-verified users without making this change.

Amendment 141 seeks to prescribe a set of principles and standards in Ofcom’s guidance on user verification. It is, however, important that Ofcom has discretion to determine, in consultation with relevant persons, which principles will have the best outcomes for users, while ensuring compliance with the duties. Further areas of the Bill also address several issues raised in this amendment. For example, all companies in scope will have a specific legal duty to have effective user reporting and redress mechanisms.

Existing laws also ensure that Ofcom’s guidance will reflect high standards. For example, it is a general duty of Ofcom under Section 3 of the Communications Act 2003 to further the interests of consumers, including by promoting competition. This amendment would, in parts, duplicate existing duties and undermine Ofcom’s independence to set standards on areas it deems relevant after consultation with expert groups.

Amendment 303 would add a definition of user identity verification. The definition it proposes would result in users having to display their real name online if they decide to verify themselves. In answer to the noble Baroness’s question, the current requirements do not specify that users must display their real name. The amendment would have potential safety implications for vulnerable users, for example victims and survivors of domestic abuse, whistleblowers and others of whom noble Lords have given examples in their contributions. The proposed definition would also create reliance on official forms of identification. That would be contrary to the existing approach in Clause 57 which specifically sets out that verification need not require such forms of documentation.

The noble Baroness, Lady Kidron, talked about paid-for verification schemes. The user identity verification provisions were brought in to ensure that adult users of the largest services can verify their identity if they so wish. These provisions are different from the blue tick schemes and others currently in place, which focus on a user’s status rather than verifying their identity. Clause 57 specifically sets out that providers of category 1 services will be required to offer all adult users the option to verify their identity. Ofcom will provide guidance for user identity verification to assist providers in complying with these duties. In doing so, it will consult groups that represent the interests of vulnerable adult users. In setting out recommendations about user verification, Ofcom must have particular regard to ensuring that providers of category 1 services offer users a form of identity verification that is likely to be available to vulnerable adult users. Ofcom will also be subject to the public sector equality duty, so it will need to take into account the ways in which people with certain characteristics may be affected when it performs this and all its duties under the Bill.

A narrow definition of identity verification could limit the range of measures that service providers might offer their users in the future. Under the current approach, Ofcom will produce and publish guidance on identity verification after consulting those with technical expertise and groups which represent the interests of vulnerable adult users.

I am sorry to interrupt the noble Lord. Is the answer to my question that the blue tick and the current Meta system will not be considered as verification under the terms of the Bill? Is that the implication of what he said?

Yes. The blue tick is certainly not identity verification. I will write to confirm on Meta, but they are separate and, as the example of blue ticks and Twitter shows, a changing feast. That is why I am talking in general terms about the approach, so as not to rely too much on examples that are changing even in the course of this Committee.

Government Amendment 43A stands in my name. This clarifies that “non-verified user” refers to users whether they are based in the UK or elsewhere. This ensures that, if a UK user decides he or she no longer wishes to interact with non-verified users, this will apply regardless of where they are based.

Finally, Amendment 106 in the name of my noble friend Lady Buscombe would make an addition to the online safety objectives for regulated user-to-user services. It would amend them to make it clear that one of the Bill’s objectives is to protect people from communications offences committed by anonymous users.

The Bill already imposes duties on services to tackle illegal content. Those duties apply across all areas of a service, including the way it is designed and operated. Platforms will be required to take measures—for instance, changing the design of functionalities, algorithms, and other features such as anonymity—to tackle illegal content.

Ofcom is also required to ensure that user-to-user services are designed and operated to protect people from harm, including with regard to functionalities and other features relating to the operation of their service. This will likely include the use of anonymous accounts to commit offences in the scope of the Bill. My noble friend’s amendment is therefore not needed. I hope she will be satisfied not to press it, along with the other noble Lords who have amendments in this group.

My Lords, I would like to say that that was a rewarding and fulfilling debate in which everyone heard very much what they wanted to hear from my noble friend the Minister. I am afraid I cannot say that. I think it has been one of the most frustrating debates I have been involved in since I came into your Lordships’ House. However, it gave us an opportunity to admire the loftiness of manner that the noble Lord, Lord Clement-Jones, brought to dismissing my concerns about Wikipedia—that I was really just overreading the whole thing and that I should not be too bothered with words as they appear in the Bill because the noble Lord thinks that Wikipedia is rather a good thing and why is it not happy with that as a level of assurance?

I would like to think that the Minister had dealt with the matter in the way that I hoped he would, but I do think, if I may say so, that it is vaguely irresponsible to come to the Dispatch Box and say, “I don’t think Wikipedia will qualify as a category 1 service”, and then refuse to say whether it will or will not and take refuge in the process the Bill sets up, when at least one Member of the House of Lords, and possibly a second in the shape of the noble Lord, Lord Clement-Jones, would like to know the answer to the question. I see a Minister from the business department sitting on the Front Bench with my noble friend. This is a bit like throwing a hand grenade into a business headquarters, walking away and saying, “It was nothing to do with me”. You have to imagine what the position is like for the business.

We had a very important amendment from my noble friend Lady Buscombe. I think we all sympathise with the type of abuse that she is talking about—not only its personal effects but its deliberate business effects, the deliberate attempt to destroy businesses. I say only that my reading of her Amendment 106 is that it seeks to impose on Ofcom an objective to prevent harm, essentially, arising from offences under Clauses 160 and 162 of the Bill committed by unverified or anonymous users. Surely what she would want to say is that, irrespective of verification and anonymity, one would want action taken against this sort of deliberate attempt to undermine and destroy businesses. While I have every sympathy with her amendment, I am not entirely sure that it relates to the question of anonymity and verification.

Apart from that, there were in a sense two debates going on in parallel in our deliberations. One was to do with anonymity. On that question, I think the noble Lord, Lord Clement-Jones, put the matter very well: in the end, you have to come down on one side or the other. My personal view, with some reluctance, is that I have come down on the same side as the Government, the noble Lord and others. I think we should not ban anonymity because there are costs and risks to doing so, however satisfying it would be to be able to expose and sue some of the people who say terrible and untrue things about one another on social media.

The more important debate was not about anonymity as such but about verification. We had the following questions, which I am afraid I do not think were satisfactorily answered. What is verification? What does it mean? Can we define what verification is? Is it too expensive? Implicitly, should it be available for free? Is there an obligation for it to be free or do the paid-for services count, and what happens if they are so expensive that one cannot reasonably afford them? Is it real, in the sense that the verification processes devised by the various platforms genuinely provide verification? Various other questions like that came up but I do not think that any of them was answered.

I hate to say this as it sounds a little harsh about a Government whom I so ardently support, but the truth is that the triple shield, also referred to as a three-legged stool in our debate, was hastily cobbled together to make up for the absence of legal but harmful, but it is wonky; it is not working, it is full of holes and it is not fit for purpose. Whatever the Minister says today, there has to be a rethink before he comes back to discuss these matters at the next stage of the Bill. In the meantime, I beg leave to withdraw my amendment.

Amendment 38 withdrawn.

Amendments 38A and 39 not moved.

Amendment 40

Moved by

40: Clause 12, page 12, line 27, after “to” insert “effectively”

Member’s explanatory statement

This amendment would bring this subsection into line with subsection (3) by requiring that the systems or processes available to users for the purposes described in subsections (7)(a) and (7)(b) should be effective.

Amendment 40 agreed.

Amendments 41 to 43ZA not moved.

Amendment 43A

Moved by

43A: Clause 12, page 13, line 20, leave out from “who” to end of line 21 and insert “—

(a) is an individual, whether in the United Kingdom or outside it, and
(b) has not verified their identity to the provider of a service;”

Member’s explanatory statement

This amendment makes it clear that the term “non-verified user” in clause 12 (user empowerment duties) refers to individuals and includes users outside the United Kingdom.

Amendment 43A agreed.

Amendments 44 and 45 not moved.

Clause 12, as amended, agreed.

Amendment 46

Moved by

46: After Clause 12, insert the following new Clause—

“Adult risk assessment duties

(1) This section sets out the duties about risk assessments in respect of adult users which apply in relation to Category 1 services.
(2) A duty to carry out a suitable and sufficient adults’ risk assessment.
(3) A duty to take appropriate steps to keep an adults’ risk assessment up to date, including when OFCOM make any significant change to a risk profile that relates to services of the kind in question.
(4) Before making any significant change to any aspect of a service’s design or operation, a duty to carry out a further suitable and sufficient adults’ risk assessment relating to the impacts of that proposed change.
(5) An “adults’ risk assessment” of a service of a particular kind means an assessment of the following matters, taking into account the risk profile that relates to services of that kind—
(a) the user base;
(b) the level of risk of adults who are users of the service encountering, by means of the service, each kind of content specified in section 12(10) to (12), taking into account (in particular) algorithms used by the service, and how easily, quickly and widely content may be disseminated by means of the service;
(c) the level of risk of functionalities of the service, including user empowerment tools, which facilitate the presence, identification, dissemination, and likelihood of users encountering or being alerted to, content specified in section 12(10) to (12);
(d) the extent to which user empowerment tools might result in interference with users’ right to freedom of expression within the law (see section 18);
(e) how the design and operation of the service (including the business model, governance, use of proactive technology, measures to promote users’ media literacy and safe use of the service, and other systems and processes) may reduce or increase the risks identified.”

Member’s explanatory statement

This and other amendments in the name of Baroness Stowell relate to risk assessments for adults in relation to platforms’ new duties to provide user empowerment tools. They would require platforms to provide public risk assessments in their terms of service and be transparent about the effect of user empowerment tools on users’ freedom of expression.

My Lords, in introducing this group, I will speak directly to the three amendments in my name—Amendments 46, 47 and 64. I will also make some general remarks about the issue of freedom of speech and of expression, which is the theme of this group. I will come to these in a moment.

The noble Lord, Lord McNally, said earlier that I had taken my amendments out of a different group—I hope from my introductory remarks that it will be clear why—but, in doing so, I did not realise that I would end up opening on this group. I offer my apologies to the noble Lord, Lord Stevenson of Balmacara, for usurping his position in getting us started.

I am grateful to the noble Baronesses, Lady Bull and Lady Featherstone, for adding their names. The amendments represent the position of the Communications and Digital Select Committee of your Lordships’ House. In proposing them, I do so with that authority. My co-signatories are a recent and a current member. I should add sincere apologies from the noble Baroness, Lady Featherstone, for not being here this evening. If she is watching, I send her my very best wishes.

When my noble friend Lord Gilbert of Panteg was its chair, the committee carried out an inquiry into freedom of speech online. This has already been remarked on this evening. As part of that inquiry, the committee concluded that the Government’s proposals in the then draft Bill—which may have just been a White Paper at that time—for content described as legal but harmful were detrimental to freedom of speech. It called for changes. Since then, as we know, the Government have dropped legal but harmful and instead introduced new user empowerment tools for adults to filter out harmful content. As we heard in earlier groups this evening, these would allow people to turn off or on content about subjects such as eating disorders and self-harm.

Some members of our committee might favour enhanced protection for adults. Indeed, some of my colleagues have already spoken in support of amendments to this end in other groups. Earlier this year, when the committee looked at the Bill as it had been reintroduced to Parliament, we agreed that, as things stood, these new user empowerment tools were a threat to freedom of speech. Whatever one’s views, there is no way of judging their impact or effectiveness—whether good or bad.

As we have heard already this evening, the Government have dropped the requirement for platforms to provide a public risk assessment of how these tools would work and their impact on freedom of speech. To be clear, for these user empowerment tools to be effective, the platforms will have to identify the content that users can switch off. This gives the platforms great power over what is deemed harmful to adults. Amendments 46, 47 and 64 are about ensuring that tech platforms are transparent about how they balance the principles of privacy, safety and freedom of speech for adults. These amendments would require platforms to undertake a risk assessment and publish a summary in their terms of service. This would involve them being clear about the effect of user empowerment tools on the users’ freedom of expression. Without such assessments, there is a risk that platforms would do either too much or too little. It would be very difficult to find out how they are filtering content and on what basis, and how they are addressing the twin imperatives of ensuring online safety without unduly affecting free speech.

To be clear, these amendments, unlike amendments in earlier groups, are neither about seeking to provide greater protection to adults nor about trying to reopen or revisit the question of legal but harmful. They are about ensuring transparency to give all users confidence about how platforms are striking the right balance. While their purpose is to safeguard freedom of speech, they would also bring benefits to those adults who wanted to opt in to the user empowerment tool because they would be able to assess what it was they were choosing not to see.

It is because of their twin benefits—indeed, their benefit to everyone—that we decided formally, as a committee, to recommend these amendments to the Government and for debate by your Lordships’ House. That said, the debate earlier suggests support for a different approach to enhancing protection for adults, and we may discover through this debate a preference for other amendments in this group to protect freedom of speech—but that is why we have brought these amendments forward.

I am now going to take off my Select Committee hat to say a few other remarks about freedom of expression—but I will not say very much, because I have the privilege of responding at the end. Indeed, there are noble Lords in the Chamber this evening who are far more steeped in this important principle of freedom of speech than me. I am keen to listen to what they have to say in order to judge to which of their amendments, if any, I will lend my support.

I should add that, perhaps unlike some other noble Lords who will speak on this group, I value freedom of speech less as an end in itself and more as a means to a thriving democracy and healthy society. I have said on various public platforms over the last few months that I would have preferred the Bill to be about only child safety, so that we could learn before deciding what, if any, further steps to take—but we are where we are. What concerns me about the online world we now inhabit is in whose hands the power exists to decide what we get to see and debate. Who has the power to influence what is an acceptable opinion to hold? Who has the power to shape society, to such an extent that they can influence and change what we believe is right or wrong?

There is a real dilemma for me between the big tech platforms’ resistance to the responsibility that comes with being a publisher and us giving them that power and responsibility via the Bill. We will come back to the question of power and how we ensure that it is spread properly between Parliament, the Executive, the regulator and media platforms in a later group but, as we have decided to legislate for online safety, I want us to be as sure as we can be that we are not giving away political powers to individuals or institutions who have no democratic mandate or are not subject to suitable oversight. Freedom of speech and the clauses to which the amendments relate is why this is an important group.

I will make one final point before I sit down. Freedom of speech is also a critical element of the Digital Markets, Competition and Consumers Bill. That is why I have been so concerned that it was introduced alongside online safety. I am glad that it has finally arrived in Parliament and that we will get to examine it before too long. But that is for another day—for now, I beg to move.

My Lords, I have slightly abused my position because, as the noble Baroness has just said, this is a rather oddly constructed group. My amendments, which carve great chunks out of the Bill—or would do if I get away with it—do not quite point in the same direction as the very good speech the noble Baroness made, representing of course the view of the committee that she chairs so brilliantly. She also picked out one or two points of her own, which we also want to debate. It therefore might be easier if I just explain what I was trying to do in my amendments; then I will sit down and let the debate go, and maybe come back to pick up one or two threads at the end.

In previous Bills—and I have seen a lot of them—people who stand up and move clause stand part debates usually have a deeper and more worrying purpose behind the proposition. Either they have not read the Bill and are just trying to wing it, or they have a plan that is so complex and deep that it would probably need another whole Bill to sort it out. This is neither of those approaches; it is done because I want to represent the views mainly of the Joint Committee. We had quite a lot of debate in that committee about this area, beginning with the question about why the Bill—or the White Paper or draft Bill, at that stage—used the term “democratic importance” when many people would have used the parallel term “public interest” to try to reflect the need to ensure that matters which are of public good take place as a result of publication, or discussion and debate, or on online platforms. I am very grateful that the noble Lord, Lord Black, is able to be with us today. I am sure he will recall those debates, and hopefully he will make a comment on some of the work—and other members of the committee are also present.

To be clear, the question of whether Clauses 13, 14, 15 and 18 should stand part of the Bill is meant to release space for a new clause in Amendment 48. It is basically designed to try to focus the actions that are going to be taken by the Bill, and subsequently by the regulator, to ensure that the social media companies that are affected by, or in scope of, the Bill use, as a focus, some of the issues mainly related to “not taking down” and providing an appeal mechanism for journalistic material, whether that is provided by recognised news publishers or some other form of words that we can use, or it is done by recognised journalists. “Contentious” is an overused word, but all these terms are difficult to square away and be happy with, and therefore we should have the debate and perhaps reflect on that later when we come back to it.

The committee spent quite a lot of time on this, and there are two things that exercised our minds when we were working on this area. First, if one uses “content of democratic importance”, although it is in many ways quite a clever use of words to reflect a sensibility that you want to have an open and well-founded debate about matters which affect the health of our democracy, it can be read as being quite limiting. It is very hard to express—I am arguing against myself here—in the words of a piece of legislation what it is we are trying to get down to, but, during the committee’s deliberations, we received evidence that the definition of content of democratic importance was wider, or more capable of being interpreted as wider, than the scope the Government seem to have indicated. So there is both a good side and a bad side to this. If we are talking about content which is, or appears to be, specifically intended to contribute to the democratic political debate of the United Kingdom, or a part or area of the United Kingdom, we have got to ask the Minister to put on the record that this is also inclusive of matters which perhaps initially do not appear necessarily to be part of it, but include public health, crime, justice, the environment, professional malpractice, the activities of large corporations and the hypocrisy of public figures when that occurs. I am not suggesting this is what we should be doing all the time, but these are things we often read about in our papers, and much the better off we are for it. However, if these things are not inclusive and not well rooted in the phrase “content of democratic importance”, it is up to the Government to come forward with a better way of expressing that, or perhaps in debate we can find it together.

I have some narrow questions. Are we agreed that what is currently in the Bill is intended specifically to contribute to democratic political debate, and is anything more needed to be said or done in order to make sure that happens? Secondly, the breadth of democratic political debate is obviously important; are there any issues here that are going to trip us up later when the Government come back and say, “Well, that wasn’t what we meant at all, and that doesn’t get covered, and therefore that stuff can be taken down, and that stuff there doesn’t have to be subject to appeal”? Are there contexts and subjects which we need to talk about? This is a long way into the question of content of democratic importance being similar or limited to matters that one recognises as relating to public interest. I think there is a case to be argued for the replacement of what is currently in the Bill with a way of trying to get closer to what we now recognise as being the standard form of debate and discussion when matters, which either the Government of the day or people individually do not like, get taken up and made the subject of legal discussion, because we do have discussions about whether or not it is in the public interest.

We probably do not know what that means. Therefore, a third part of my argument is that perhaps this is the point at which we try to define this, even though that might cause a lot of reaction from those currently in the press. In a sense, it is a question that needs to be resolved. Maybe this is or is not the right time to do that. Are the Government on the same page as the Joint Committee on this? Do they have an alternative and is this what they are trying to get across in the Bill?

Can we have a debate and discussion in relation to those things, making it clear that we want something in the Bill ensuring that vibrant political debate—the sort of things the noble Baroness was talking about on freedom of expression, but in a broader sense covering all the things that matter to the body politic, the people of this country—is not excluded by the Bill? That was the reason for putting down a raft of rather aggressive amendments. I hope it has been made clear that that was the case. I have other things that I would like to come back to, but I will probably do that towards the end of the debate. I hope that has been helpful.

My Lords, I will speak to the amendments in the name of the noble Baroness, Lady Stowell, to which I have added my name. As we heard, the amendments originally sat in a different group, on the treatment of legal content accessed by adults. Noble Lords will be aware from my previous comments that my primary focus for the Bill has been on the absence of adequate provisions for the protection of adults, particularly those who are most vulnerable. These concerns underpin the brief remarks I will make.

The fundamental challenge at the heart of the Bill is the need to balance protection with the right to freedom of expression. The challenge, of course, is how. The noble Baroness’s amendments seek to find that balance. They go beyond the requirements on transparency reporting in Clause 68 in several ways. Amendment 46 would provide a duty for category 1 services to maintain an up-to-date document for users of the service, ensuring that users understand the risks they face and how, for instance, user empowerment tools can be used to help mitigate these risks. It also provides a duty for category 1 services to update their risk assessments before making any “significant change” to the design or operation of their service. This would force category 1 services to consider the impact of changes on users’ safety and make users aware of changes before they happen, so that they can take any steps necessary to protect themselves and prepare for them. Amendment 47 provides additional transparency by providing a duty for category 1 services to release a public statement of the findings of the most recent risk assessment, which includes any impact on freedom of expression.

The grouping of these amendments is an indication, if any of us were in doubt, of the complexity of balancing the rights of one group against the rights of another. Regardless of the groupings, I hope that the Minister takes note of the breadth and depth of concerns, as well as the willingness across all sides of the Committee to work together on a solution to this important issue.

My Lords, I put my name to Amendment 51, which is also in the name of the noble Lords, Lord Stevenson and Lord McNally. I have done so because I think Clause 15 is too broad and too vague. I declare an interest, having been a journalist for my entire career. I am currently a series producer of a series of programmes on Ukraine.

This clause allows journalism on the internet to be defined simply as the dissemination of information, which surely covers all posts on the internet. Anyone can claim that they are a journalist if that is the definition. My concern is that it will make a nonsense of the Bill if all content is covered as journalism.

I support the aims behind the clause to protect journalism in line with Article 10. However, I am also aware of the second part of Article 10, which warns that freedom of speech must be balanced by duties and responsibilities in a democratic society. This amendment aims to hone the definition of journalism to that which is in the public interest. In doing so, I hope it will respond to the demands of the second part of Article 10.

It has never been more important to create this definition of journalism in the public interest. We are seeing legacy journalism of newspapers and linear television being supplanted by digital journalism. Both legacy and new journalism need to be protected. This can be a single citizen journalist, or an organisation like Bellingcat, which draws on millions of digital datapoints to create astonishing digital journalism to prove things such as that Russian separatist fighters shot down flight MH17 over Ukraine.

The Government’s view is that the definition of “in the public interest” is too vague to be useful to tech platforms when they are systematically filtering through possible journalistic content that needs to be protected. I do not agree. The term “public interest” is well known to the courts from the Defamation Act 2013. The law covers the motivation of a journalist, but does not go on to define the content of journalism to prove that it is in the public interest.

Surely what defines the public interest in journalism is proof that a process has been followed to ensure the accuracy and fairness of the information purveyed. A journalist using a public interest defence would show that they have checked the facts for accuracy by using authoritative or verifiable sources for their information. But, if the Government will not accept this definition and say that it is too hard to define “public interest”, the response should be to look at the laws that do that.

I ask the Committee to look at the public interest tests put forward by the Information Commissioner’s Office when deciding whether to grant a freedom of information request. They require the content to “promote public understanding” and safeguard the democratic process, uphold “standards of integrity”, ensure “justice and fair treatment” for all, and ensure the “best use” of public resources.

This is not an extensive list of the criteria that can be used to define “public interest”, so I also suggest that the Minister looks at the Public Interest Disclosure Act 1998, which aims to protect employees from unfair dismissal due to whistleblowing. It goes further in trying to define the disclosures that might be protected because they are in the public interest: a request should ensure that the information disclosed will reveal

“that a criminal offence has been committed, … that a person has failed … to comply with any … legal obligation to which he is subject, … that a miscarriage of justice has occurred, … that the health or safety of any individual has been … endangered”,

and

“that the environment has been … or is likely to be damaged”.

These definitions can be built on or worked through. Both Acts show that Parliament has successfully accepted the concept of the public interest defence and defined it, albeit in a limited way.

This amendment would ensure that category 1 services protect journalism in the public interest. This is not the same as the powerful exemption offered to content provided by news publishers in Clause 50, which are defined by a clear set of criteria. Under Amendment 51, the journalism covered in Clause 15 would not have to belong to a regulator to qualify as being in the public interest; the author just has to prove that they have acted responsibly to deliver accurate and verifiable journalism. This would not stop disinformation appearing on the internet—which should be allowed to continue so that it can be refuted—but it would ensure that it does not benefit from the protection offered by Clause 15.

The Bill changes for ever the controversy about whether the platforms are publishers. Companies come within the scope of the Bill as publishers, and, as such, should have the ability to distinguish content that is accurate and fair public interest journalism and, as Clause 15(2) says, create a service

“using proportionate systems and processes designed to ensure that the importance of the free expression of journalistic content is taken into account”.

I am a great supporter of freedom of expression, and I am glad that the Bill contains protections for that. However, if category 1 companies will be asked to provide this protection, it has to be less vague and more defined. This amendment offers some way towards an answer.

My Lords, at the beginning of Committee, I promised that I would speak only twice, and this is the second time. I hope that noble Lords will forgive me if I stray from the group sometimes, but I will be as disciplined as I can. I will speak to Amendments 57 and 62, which the noble Baroness, Lady Featherstone, and I tabled. As others have said, the noble Baroness sends her apologies; sadly, she has fractured her spine, and I am sure we all wish her a speedy recovery. The noble Baroness, Lady Fox, has kindly added her name to these amendments.

As I have said, in a previous role, as a research director of a think tank—I refer noble Lords to my registered interests—I became interested in the phenomenon of unintended consequences. As an aside, it is sometimes known as the cobra effect, after an incident during the colonial rule of India, when a British administrator of Delhi devised a cunning plan to rid the city of dangerous snakes. It was simple: he would pay local residents a bounty for each cobra skin delivered. What could possibly go wrong? Never slow to exploit an opportunity, enterprising locals started to farm cobras as a way of earning extra cash. Eventually, the authorities grew wise to this, and the payments stopped. As a result, the locals realised that the snakes were now worthless and released them into the wild, leading to an increase, rather than a decrease, in the population of cobras.

As with the cobra effect, there have been many similar incidents of well-intentioned acts that have unintentionally made things worse. So, as we try to create a safer online space for our citizens, especially children and vulnerable adults, we should try to be as alert as we can to unintended consequences. An example is encrypted messages, which I discussed in a previous group. When we seek access to encrypted messages in the name of protecting children in this country, we should be aware that such technology could lead to dissidents living under totalitarian regimes in other countries being compromised or even murdered, with a devastating impact on their children.

We should also make sure that we do not unintentionally erode the fundamental rights and freedoms that underpin our democracy, and that so many people have struggled for over the centuries. I recognise that some noble Lords may say that that is applicable to other Bills, but I want to focus specifically on the implications for this Bill. In our haste to protect, we may create a digital environment and marketplace that stifles investment and freedom of expression, disproportionately impacting marginalised communities and cultivating an atmosphere of surveillance. The amendments the noble Baroness and I have tabled are designed to prevent such outcomes. They seek to strike a balance between regulating for a safer internet and preserving our democratic values. As many noble Lords have rightly said, all these issues will involve trade-offs; we may disagree, but I hope we will have had an informed debate, regardless of which side of the argument we are on.

We should explicitly outline the duties that service providers and regulators have with respect to these rights and freedoms. Amendment 57 focuses on safeguarding specific fundamental rights and freedoms for users of regulated user-to-user services, including the protection of our most basic human rights. We believe that, by explicitly stating these duties, rather than hoping that they are somehow implied, we will create a more comprehensive framework for service providers to follow, ensuring that their safety policies and procedures do not undermine the essential rights of users, with specific reference to

“users with protected characteristics under the Equality Act 2010”.

Amendment 62 focuses on the role of Ofcom in mitigating risks to freedom of expression. I recognise that there are other amendments in this group on that issue. It is our responsibility to ensure that the providers of regulated user-to-user services are held accountable for their content moderation and recommender systems, to ensure they do not violate our freedoms.

I want this Bill to be a workable Bill. As I have previously said, I support the intention behind it to protect children and vulnerable adults, but as I have said many times, we should also be open about the trade-off between security and protection on the one hand, and freedom of expression on the other. My fear is that, without these amendments, we risk leaving our citizens vulnerable to the unintended consequences of overzealous content moderation, biased algorithms and opaque decision-making processes. We should shine a light on and bring transparency to our new processes, and perhaps help guide them by being explicit about those elements of freedom of speech we wish to preserve.

It is our duty to ensure that the Online Safety Bill not only protects our citizens from harm but safeguards the principles that form the foundation of a free and open society. With these amendments, we hope to transcend partisan divides and to fortify the essence of our democracy. I hope that we can work together to create an online environment that is safe, inclusive and respectful of the rights and freedoms that the people of this country cherish. I hope that other noble Lords will support these amendments, and, ever the optimist, that my noble friend the Minister will consider adopting them.

My Lords, it is a great pleasure to follow the noble Lord, Lord Kamall, who explained well why I put my name to the amendments. I extend my regards to the noble Baroness, Lady Featherstone; I was looking forward to hearing her remarks, and I hope that she is well.

I am interested in free speech; it is sort of my thing. I am interested in how we can achieve a balance and enhance the free speech rights of the citizens of this country through the Bill—it is what I have tried to do with the amendments I have supported—which I fear might be undermined by it.

I have a number of amendments in this group. Amendment 49 and the consequential Amendments 50 and 156 would require providers to include in their terms of service

“by what method content present on the service is to be identified as content of democratic importance”,

and bring Clause 13 in line with Clauses 14 and 15 by ensuring an enhanced focus on the democratic issue.

Amendment 53A would provide that notification is given

“to any user whose content has been removed or restricted”.

It is especially important that the nature of the restriction in place be made clear, evidenced and justified in the name of transparency and—a key point—that the user be informed of how to appeal such decisions.

Amendment 61 in my name calls for services to have

“proportionate systems, processes and policies designed to ensure that as great a weight is given to users’ right to freedom of expression ... as to safety when making decisions”

about whether to take down or restrict users’ access to the online world, and

“whether to take action against a user generating, uploading or sharing content”.

In other words, it is all about applying a more robust duty to category 1 service providers and emphasising the importance of protecting

“a wide diversity of political, social, religious and philosophical opinion”.


I give credit to the Government, in that Clause 18 constitutes an attempt by them in some way to balance the damage to individual rights to freedom of expression and privacy as a result of the Bill, but I worry that it is a weak duty. Unlike operational safety duties, which compel companies proactively to prevent or minimise so-called harm in the way we have discussed, there is no such attempt to insist that freedom of speech be given the same regard or importance. In fact, there are worries that the text of the Bill has downgraded speech and privacy rights, which the Open Rights Group says

“are considered little more than a contractual matter”.

There has certainly been a lot of mention of free speech in the debates we have had so far in Committee, yet I am not convinced that the Bill gives it enough credit, which is why I support the explicit reference to it by the noble Lord, Lord Kamall.

I have a lot of sympathy with the amendments of the noble Lord, Lord Stevenson, seeking to replace Clauses 13, 14, 15 and 18 with a single comprehensive duty, because in some ways we are scratching around. That made some sense to me and I would be very interested to hear more about how that might work. Clauses 13, 14, 15 and 18 state that service providers must have regard to the importance of protecting users’ rights to freedom of expression in relation to

“content of democratic importance ... publisher content ... journalistic content”.

The very existence of those clauses, and the fact that we even need those amendments, is an admission by the Government that elsewhere, free speech is a downgraded virtue. We need these carve-outs to protect these things, because the rest of the Bill threatens free speech, which has been my worry from the start.

My Amendment 49 is a response to the Bill’s focus on protecting “content of democratic importance”. I was delighted that this was included, and the noble Lord, Lord Stevenson of Balmacara, has raised a lot of the questions I was asking. I am concerned that it is rather vaguely drawn, and too narrow and technocratic—politics with a big “P”, rather than in the broader sense. There is a lot that I would consider democratically important that other people might see, especially given today’s discussion, as harmful or dangerous. Certainly, the definition should be as broad as possible, so my amendment seeks to write that down, saying that it should include

“political, social, religious and philosophical opinion”.

That is my attempt to broaden it out. It is not perfect, I am sure, but that is the intention.

I am also keen to understand why Clauses 14 and 15, which give special protection to news publisher and journalistic content, have enhanced provisions, including an expedited appeals process for the reinstatement of removed materials, but those duties are much weaker—they do not exist—in Clause 13, which deals with content of democratic importance. In my amendment, I have suggested that they are levelled up.

My Amendment 61 attempts to tackle the duties that will be used for companies in terms of safety, which is the focus of the Bill. It stresses that equal weight should be given to free speech and to safety. This relates to the content of democratic importance that I have just been talking about, because I argue that democracy is not safe if we do not proactively promote freedom. Both those amendments try to ensure that companies act to remove philosophical, religious, democratic and social material only in extremis—as an exception, not the rule—and that they always have free speech at the forefront.

On the issue of how we view content of democratic importance, one thing has not been stressed in our discussions so far. We should note that the right to freedom of expression is not just about defending the ability of individuals to speak or impart information; it is also the right of the public to receive information and the freedom to decide what they find useful or second-rate and what they want to watch or listen to. It is not just the right to post opinions but the right of others to have access to diverse opinions and postings; that kind of free flow of information is the very basis of our democracy. In my view, despite its talk of user controls and user empowerment, the Bill does not allow for that or take it into account enough.

It is very important, therefore, that users are told if their posts are restricted, how they are restricted and how they can appeal. That is the focus of Amendment 53A. The EHRC says that the Bill overall lacks a robust framework for individuals to appeal platforms’ decisions or to seek redress for unjustified censorship. I think that needs to be tackled. Clause 19 has a basic complaints procedure, but my amendment to Clause 17 tries to tackle what is a very low bar by stressing the need for “evidenced justification” and details on how to appeal. Users need to know exactly why there has been a decision to restrict or remove. That is absolutely crucial.

Ofcom is the enforcer in all this, with the Secretary of State of the day being given a plethora of new delegated powers, which I think we need to be concerned about. As the coalition group Legal to Say, Legal to Type notes, the Bill in its current form gives extensive powers to the Secretary of State and Ofcom:

“This would be the first time since the 1600s that written speech will be overseen by the state in the UK”.

The truth is that we probably need a new Milton, but in 2023 what we have instead is a Moylan. I have put my name to a range of the excellent series of amendments from the noble Lord, Lord Moylan, including Amendments 102, 191 and 220, all dealing with Ofcom and the Secretary of State. As he will explain, it is really crucial that we take that on.

I did not put my name to the noble Lord’s Amendment 294, although I rather wish I had. In some ways this is a key amendment, as it would leave out the word “psychological” from the definition of harm. As we have gone through all these discussions so far in Committee and at Second Reading and so on, the definition of harm is something that, it seems to me, is very slippery and difficult. People just say, “We have to remove harmful content” or, “It is okay to remove harmful content”, but it is not so simple.

I know that any philosophical rumination is frowned upon at this stage—I was told off for it the other day—but, as this is the 150th anniversary of JS Mill’s death, let me note that his important harm principle has been somewhat bastardised by an ever-elastic concept of harm.

Psychological harm, once added into the mix—I spoke about this before—is going to lead to the over-removal of lawful content, because what counts as harm is not settled online or offline. There is no objective way of ascertaining whether emotional or psychological harm has occurred. Therefore, it will be impossible to determine whether service providers have discharged their duties. Controversies of interpretation about what is harmful have already left the door open to activist capture, and this concept is regularly weaponised to close down legitimate debate.

The concept of harm, once expanded to include psychological harm, is subject to concept creep and subjectivity. The lack of definition was challenged by the Lords Communications and Digital Committee when it wrote to the Secretary of State asking whether psychological harm had any objective clinical basis. DCMS simply confirmed that it did not, yet psychological harm is going to be used as a basis for removing lawful speech from the online world. That can lead only to a censorious and, ironically, more toxic online environment, with users posting in good faith finding their access to services—access that is part of the democratic public square—being shut down temporarily or permanently, even reported to the law or what have you, just because they have been accused of causing psychological harm. The free speech elements of the Bill need to be strengthened enormously.

My Lords, my Amendment 63 is about the meaning of words. It was an interesting feature of the speech made by the noble Baroness, Lady Fox of Buckley, which we have just had the pleasure of listening to, that she slipped from time to time from the phrase “freedom of expression” to “freedom of speech”. That is not a criticism; it is very easy for one to treat these expressions as meaning the same thing. Others in this debate have done the same thing. I think that the noble Baroness, Lady Stowell, used “freedom of speech” sometimes, as well as “freedom of expression”. It is not a criticism; it is just a fact that we tend to treat the two the same.

However, the Government in Clause 18 have chosen to use the words

“freedom of expression within the law”.

My amendment draws attention to that feature. If we work our way through Clause 18, its purpose is to set out the duties about freedom of expression and privacy that are to apply in relation to the user-to-user services referred to in that clause. Clause 18(2) imposes on those providing user-to-user services

“a duty to have particular regard to the importance of protecting users’ right to freedom of expression within the law”

when deciding on and implementing safety measures and policies. Clause 18(8) provides a definition of the phrase “safety measures and policies”, which

“means measures and policies designed to secure compliance with any of the duties set out”

in previous clauses of the Bill. These extend to illegal content, to children’s online safety, to user empowerment, to content reporting relating to illegal content and content that is likely to be harmful to children, and to complaints procedures. So a balance has to be struck between giving effect to the right to freedom of expression within the law and performing the important duties referred to in the clause. As Clause 18(4) explains, when decisions are being taken about the safety measures and policies that are to be introduced or applied, there must be an assessment of the impact that they would have on the user’s right to freedom of expression within the law.

My amendment was prompted by a point made by the Constitution Committee, of which I am a member, in its report on the Bill. It suggested that the House might wish to consider whether, in the interests of legal certainty, the expression “freedom of expression” should also be defined for the purposes of this clause.

The committee referred to the fact that in its report on the Higher Education (Freedom of Speech) Bill, it recommended that that Bill should define the expression “freedom of speech”, which is what that Bill was talking about, by referring to Article 10 of the European Convention on Human Rights. I raised this issue by proposing an amendment to that effect in Committee on that Bill. On Report, a government amendment to achieve that was agreed to and, in due course, it was also agreed by the House of Commons. My Amendment 63 adopts the same wording as that used in the Higher Education (Freedom of Speech) Bill, and I suggest that it should be adopted here, too, in the interests of consistency and to provide the desirable element of legal certainty.

Although it appears in a different group, I think it is worth referring to Amendment 58 in the names of the noble Baroness, Lady Fraser of Craigmaddie, and the noble Lord, Lord Foulkes of Cumnock. It proposes the insertion of the words

“as defined under the Human Rights Act 1998 and its application to the United Kingdom”,

so it is making the same point and an additional one, which is this. We have to be very careful in this Bill to recognise that it extends to all parts of the United Kingdom, particularly in regard to the devolved Administrations in Scotland, Wales and Northern Ireland. Scotland is very active in promoting legislation dealing with matters of this kind, and it is rather important that we should define in the Bill what is meant by

“freedom of expression within the law”

in its application throughout the United Kingdom, lest there should be any doubt as to what it might mean in the other parts of this country—particularly, if I may say so, with regard to Scotland. The noble Baroness, Lady Fraser, may say more about this at this stage, although her amendment is in a different group, because it is very pertinent to the point I am trying to make about the need for a definition in Clause 18.

That is the reasoning behind the amendment, and I come back to the interesting feature that one tends to mix the expressions “freedom of speech” and “freedom of expression”, but it is important to anchor exactly why the Government chose to use the words

“freedom of expression within the law”

for the purposes of this clause.

My Lords, I hung back in the hope that the noble and learned Lord, Lord Hope of Craighead, would speak before me, because I suspected that his remarks would help elucidate my amendments, as I believe they have. I have a large number of amendments in this group, but all of them, with one exception, work together as, effectively, a single amendment. They are Amendments 101, 102, 109, 112, 116, 121, 191 and 220. The exception is Amendment 294, to which the noble Baroness, Lady Fox of Buckley, alluded and to which I shall return in a moment.

Taking that larger group of amendments first, I can describe their effect relatively briefly. In the Bill, there are requirements on services to consider how their practices affect freedom of expression, but there is no equivalent explicit duty on the regulator, Ofcom, to have regard to freedom of expression.

These amendments, taken together, would require Ofcom to

“have special regard to freedom of expression”

within the law when designing codes of practice, writing guidance and undertaking enforcement action. They would insert a new clause requiring Ofcom to have special regard to rights to freedom of expression within the law in preparing a code of practice; they would also require Ofcom, when submitting a draft code to the Secretary of State, to submit a statement setting out that it had complied with the duty imposed by that new requirement; and they would require the Secretary of State to submit that statement to Parliament when laying a draft code before Parliament. They would impose similar obligations on Ofcom and the Secretary of State when making amendments to codes that might be made later. Finally, they would have a similar effect relating to guidance issued by Ofcom.

It is so glaringly obvious that Ofcom should be under this duty that it must be a mere omission that the balancing, corresponding duty placed on the providers has not also been placed on the regulator. I would hope, though experience so far in Committee does not lead me to expect, that my noble friend would accept this, and that it would pass relatively uncontroversially.

I will say no more about it, except to make one slightly more reflective comment—and here I am very conscious of speaking in the presence of the noble and learned Lord, Lord Hope of Craighead, who is perfectly entitled to correct me if I stray. There has been a great deal of comment from the Front Bench and from other parts of the Committee about how the Bill has to balance freedom of expression with safety, and inevitably such a balance is required. But in any such balance, the scales have to be tipped in favour of freedom of expression, because freedom of expression is a human right in the European Convention on Human Rights.

It is true of course that the second part of Article 10 allows it to be mitigated in some ways, but the starting point has to be the first clause of Article 10, which states that freedom of expression stands as a fundamental human right. Every abridgement of it has to be justified individually in relation to the second part; it is not enough to say that the two are somehow equal and that we have to find a balance that is purely prudential or that fits in with our notions of common sense or good judgment. There is a weighting in that balance, and that weighting is in favour of freedom of expression. So, I would strongly encourage noble Lords to bear that in mind, and I hope that this relatively simple proposal will find widespread acceptance.

I come now to Amendment 294, which is completely different but relates to this question of the definition of harm. As the noble Baroness, Lady Fox of Buckley, said, harm is defined very loosely and vaguely in the Bill—it is defined simply as “physical or psychological harm”, which is a self-referential definition and expands it somewhat.

I think we all understand what might be meant by “physical harm”, but, when it comes to “psychological harm”, I could understand a definition that had a basis in medical science. Perhaps the right word for such a definition would be “psychiatric harm”; I could understand that because medical science has some objective basis to it. But when one finds the words “psychological harm” being used, and when the department confirms that there is no objective basis for it, one is effectively opening the door to talking about “feelings”.

I know of course that there are genuine psychological harms which give great concern to Members of this Committee, including myself. Psychological harms that lead to eating disorders are a good example, and I understand that; I am not trying to trivialise psychological harms. This amendment is a probing amendment; it is trying to find out what the Government mean and what boundaries, if any, they set to their understanding of the term “psychological”. If there are no boundaries, it really does extend to “feelings”, because that is how the term is increasingly used, especially among the young—and that is a very loose definition.

So, in probing the Government on what they mean by “psychological harm”, I hope to have something hard and solid coming back from them that we know sets some limits to where this can take us.

My Lords, this is my first opportunity to speak in Committee on this important Bill, but I have followed it very closely, and the spirit in which constructive debate has been conducted has been genuinely exemplary. In many ways, it mirrors the manner in which the Joint Committee, on which I had the privilege to serve with other noble Lords, was conducted, and its report rightly has influenced our proceedings in so many ways. I declare an interest as deputy chairman of Telegraph Media Group, which is a member of the News Media Association, and a director of the Regulatory Funding Company, and note my other interests as set out in the register.

I will avoid the temptation to ruminate philosophically, as the noble Baroness, Lady Fox, entertained us by doing. I will speak to Amendment 48, in the name of the noble Lord, Lord Stevenson of Balmacara, and the other amendments which impact on the definition of “recognised news publisher”. As the noble Lord said, his amendments are pretty robust in what they seek to achieve, but I am very pleased that he has tabled them, because it is important that we have a debate about how the Bill impacts on freedom of expression—I use that phrase advisedly—and press and media freedom. The noble Lord’s aims are laudable, but his amendments do not quite deliver what he intends.

I will explain why it is important that Clauses 13 and 14 stand part of the Bill, and without amendments of the sort proposed. The Joint Committee considered this issue in some detail and supported the inclusion of the news publisher content exemption. These clauses are crucial to the whole architecture of the Bill because they protect news publishers from being dragged into an onerous regime of statutory content control. The press—these clauses cover the broadcasters too—have not been subject to any form of statutory regulation since the end of the 17th century. That is what we understand by press freedom: that the state and its institutions do not have a role in controlling or censoring comment. Clauses 13 and 14 protect that position and ensure that the media, which is of course subject to rigorous independent standard codes as well as to criminal and civil law, does not become part of a system of state regulation by the back door because of its websites and digital products.

That is what is at the heart of these clauses. However, it is not a carte blanche exemption without caveats. As the Joint Committee looked at, and as we have heard, to qualify for it, publishers must meet stringent criteria, as set out in Clause 50, which include being subject to standards codes, having legal responsibility for material published, having effective policies to handle complaints, and so on. It is exactly the same tough definition as was set out in the National Security Bill, which noble Lords across the House supported when it was on Report here.

Without such clear definitions, alongside requirements not to take down or restrict access to trusted news sources without notification, opaque algorithms conjured up in Silicon Valley would end up restricting the access of UK citizens to news, with scant meaningful scope for reinstating it given the short shelf life of news. Ultimately, that would have a profound impact on the public’s right to access news, something which the noble Baroness rightly highlighted. That is why the Joint Committee recommended, at paragraph 304 of its report, that the Bill was

“strengthened to include a requirement that news publisher content should not be moderated, restricted or removed unless it is content the publication of which clearly constitutes a criminal offence, or which has been found to be unlawful by order of a court within the appropriate jurisdiction”.

The Government listened to that concern that the platforms would put themselves in the position of censor on issues of democratic importance, and quite rightly amended the draft Bill to deal with that point. Without it, instead of trusted, curated, regulated news comment, from the BBC to the Guardian to the Manchester Evening News, news would end up being filtered by Google and Facebook. That would be a crushing blow to free speech, to which all noble Lords are absolutely committed.

So, instead of these clauses acting as a bulwark against disinformation by protecting content of democratic importance, they would weaken the position of trusted news providers by introducing too much ambiguity into the system. As we all know, ambiguity brings with it legal challenge and constant controversy. This is especially so given that the exemptions that we are talking about already exist in statute elsewhere, which would cause endless confusion.

I understand the rationale behind many of the amendments, but I fear they would not work in practice. Free speech—and again I use the words advisedly—is a very delicate bloom, which can easily be swept away by badly drafted, uncertain or opaque laws. Its protection needs certainty, which is what the Bill, as it stands, provides. A general catch-all clause would be subject, I fear, to endless argument with the platforms, which are well known for such tactics and for endless legal wrangling.

I noted the remarks of the noble Lord, Lord Stevenson of Balmacara, in his superb speech on the opening day in Committee, when he said that one issue with the Bill is that it

“is very difficult to understand, in part because of its innate complexity and in part because it has been revised so often”. [Official Report, 19/4/23; col. 700.]

He added, in a welcome panegyric to clarity and concision, that given that it is a long and complex Bill, why would we add to it? I agree absolutely with him, but those are arguments for not changing the Bill in the way he proposes. I believe the existing provisions are clear and precise, practical and carefully calibrated. They do not leave room for doubt, and protect media freedom, investigative journalism and the citizen’s right to access authoritative news, which is why I support the Bill as it stands.

My Lords, given the lateness of the hour, I will make just three very brief points. The first is that I find it really fascinating that the amendments in the name of the noble Baroness, Lady Stowell, come from a completely different perspective, but still demand transparency over what is going on. I fully support the formulation that she has found, and I think that in many ways her amendments are better than the other ones which came from the other perspective. But what I urge the Minister to hear is that we all seek transparency over what is going on.

Secondly, in many of the amendments—I think I counted about 14 or 15 in the name of the noble Lord, Lord Moylan, and also of the noble Lord, Lord Kamall—there is absolutely nothing I disagree with. My problem with these amendments really goes back to the debate we had on the first day on Amendment 1, in the name of the noble Lord, Lord Stevenson. He set out the purposes of the Bill, and the Minister gave what was considered by most Members of your Lordships’ House to be the groundwork of a very excellent alternative, in the language of government. It appears, as we go on, that many dozens of amendments could be dropped in favour of this purposive clause, which itself could include reference to human rights, children’s rights, the Equality Act, the importance of freedom of expression under the law, and so on. I urge the Minister to consider the feeling of the House: that the things said at the Dispatch Box to be implicit, again and again, the House requires to be explicit. This is one way we could do it, in short form, as the noble Lord, Lord Black, just urged us.

Thirdly, I do have to speak against Amendment 294. I would be happy to take the noble Lord, Lord Moylan, through dozens of studies that show the psychological impact of online harms: systems that groom users to gamble, that reward them for being online at any cost to their health and well-being, that profile them to offer harmful material, and more of the same whether they ask for it or not, and so on. I am also very happy to put some expert voices at his disposal, but I will just say this: the biggest clue as to why this amendment is wrongheaded is the number of behavioural psychologists that are employed by the tech sector. They are there, trying to get at our behaviours and thoughts; they anticipate our move and actually try to predict and create the next move. That is why we have to have psychological harm in the Bill.

I will not detain noble Lords very long either. Two things have motivated me to be involved in this Bill. One is protection for vulnerable adults and the second is looking at this legislation with my Scottish head on, because nobody else seems to be looking at it from the perspective of the devolved Administrations.

First, on protection for vulnerable adults, we have already debated the fact that in an earlier iteration of this Bill, there were protections. These have been watered down and we now have the triple shield. Whether they fit here, with the amendment from my noble friend Lady Stowell, or fit earlier, what we are all asking for is the reinstatement of risk assessments. I come at this from a protection of vulnerable groups perspective, but I recognise that others come at it from a freedom of expression perspective. I do not think the Minister has answered my earlier questions. Why have risk assessments been taken out and why are they any threat? It seems to be the will of the debate today that they do nothing but strengthen the transparency and safety aspects of the Bill, wherever they might be put.

I speak with trepidation to Amendment 63 in the name of the noble and learned Lord, Lord Hope of Craighead. I flatter myself that his amendment and mine are trying to do a similar thing. I will speak to my amendment when we come to the group on devolved issues, but I think what both of us are trying to establish is, given that the Bill is relatively quiet on how freedom of expression is defined, how do platforms balance competing rights, particularly in the light of the differences between the devolved Administrations?

The Minister will know that the Hate Crime and Public Order (Scotland) Act 2021 made my brain hurt when trying to work out how this Bill affects it, or how it affects the Bill. What is definitely clear is that there are differences between the devolved Administrations in how freedom of expression is interpreted. I will study the noble and learned Lord’s remarks very carefully in Hansard; I need a little time to think about them. I will listen very carefully to the Minister’s response and I look forward to the later group.

My Lords, I too will be very brief. As a member of the Communications and Digital Committee, I just wanted to speak in support of my noble friend Lady Stowell of Beeston and her extremely powerful speech, which seems like it was quite a long time ago now, but it was not that long. I want to highlight two things. I do not understand how, as a number of noble Lords have said, having risk assessments is a threat to freedom of expression. I think the absolute opposite is the case. They would enhance all the things the noble Baroness, Lady Fox, is looking to see in the Bill, just as much as they would enhance the protections that my noble friend, who I always seem to follow in this debate, is looking for.

Like my noble friend, I ask the Minister: why not? When the Government announced the removal of legal but harmful and the creation of user empowerment tools, I remember thinking—in the midst of being quite busy with Covid—“What are user empowerment tools and what are they going to empower me to do?” Without a risk assessment, I do not know how we answer that question. The risk is that we are throwing that question straight to the tech companies to decide for themselves. A risk assessment provides the framework that would enable user empowerment tools to do what I think the Government intend.

Finally, I too will speak against my noble friend Lord Moylan’s Amendment 294 on psychological harm. It is well documented that tech platforms are designed to drive addiction. Addiction can be physiological and psychological. We ignore that at our peril.

My Lords, it is a pleasure to have been part of this debate and to have heard how much we are on common ground. I very much hope that, in particular, the Minister will have listened to the voices on the Conservative Benches that have very powerfully put forward a number of amendments that I think have gained general acceptance across the Committee.

I fully understand the points that the noble Lord, Lord Black, made and why he defends Clause 14. I hope we can have a more granular discussion about the contents of that clause rather than wrap it up on this group of amendments. I do not know whether we will be able to have that on the next group.

I thank the noble Baroness, Lady Stowell, for putting forward her amendment. It is very interesting, as the noble Baronesses, Lady Bull and Lady Fraser, said, that we are trying to get to the same sort of mechanisms of risk assessment, perhaps out of different motives, but we are broadly along the same lines and want to see them for adult services. We want to know from the Minister why we cannot achieve that, basically. I am sure we could come to some agreement between us as to whether user empowerment tools or terms of service are the most appropriate way of doing it.

We need to thank the committee that the noble Baroness chairs for having followed up on the letter to the Secretary of State for DCMS, as was, on 30 January. It is good to see a Select Committee using its influence to go forward in this way.

The amendments tabled by the noble Lord, Lord Kamall, and supported by my noble friend Lady Featherstone—I am sorry she is unable to be here today, as he said—are important. They would broaden out consideration in exactly the right kind of way.

However, dare I say it, probably the most important amendment in this group is Amendment 48 in the name of the noble Lord, Lord Stevenson. Apart from the Clause 14 stand part notice, it is pretty much bang on where the Joint Committee got to. He was remarkably tactful in not going into any detail on the Government’s response to that committee. I will not read it out because of the lateness of the hour, but the noble Viscount, Lord Colville, got pretty close to puncturing the Government’s case that there is no proper definition of public interest. It is quite clear that there is a perfectly respectable definition in the Human Rights Act 1998 and, as the noble Viscount said, in the Defamation Act 2013, which would be quite fit for purpose. I do not quite know why the Government responded as they did at paragraph 251. I very much hope that the Minister will have another look at that.

The amendment from the noble and learned Lord, Lord Hope, which has the very respectable support of Justice, is also entirely apposite. I very much hope that the Government will take a good look at that.

Finally, and extraordinarily, I have quite a lot of sympathy with the amendments from the noble Lord, Lord Moylan. It was all going so well until we got to Amendment 294; up to that point I think he had support from across the House, because placing that kind of duty on Ofcom would be a positive way forward.

As I say, getting a clause of the kind that the noble Lord, Lord Stevenson, has put forward, with that public interest content point and with an umbrella duty on freedom of expression, allied to the definition from the noble and learned Lord, Lord Hope, would really get us somewhere.

Lawyers—don’t you love them? How on earth are we supposed to unscramble that at this time of night? It was good to have my kinsman, the noble and learned Lord, Lord Hope, back in our debates. We were remarking only a few days ago that we had not seen enough lawyers in the House in these debates. One appears, and light appears. It is a marvellous experience.

I thank the Committee for listening to my earlier introductory remarks; I hope they helped to untangle some of the issues. The noble Lord, Lord Black, made it clear that the press are happy with what is in the current draft. There could be some changes, and we have heard a number of examples of ways in which one might either top or tail what there is.

There was one question that perhaps he could have come back on, and maybe he will, as I have raised it separately with the department before. I agree with a lot of what he said, but it applies to a lot more than just news publishers. Quality journalism more generally enhances and restores our faith in public services in so many ways. Why is it only the news? Is there a way in which we could broaden that? If there is not this time round, perhaps that is something we need to pick up later.

As the noble Lord, Lord Clement-Jones, has said, the noble Viscount, Lord Colville, made a very strong and clear case for trying to think again about what journalism does in the public realm and making sure that the Bill at least carries that forward, even if it does not deal with some of the issues that he raised.

We have had a number of other good contributions about how to capture some of the good ideas that were flying around in this debate and keep them in the foreground so that the Bill is enhanced. But I think it is time that the Minister gave us his answers.

I join noble Lords who have sent good wishes for a speedy recovery to the noble Baroness, Lady Featherstone.

Amendments 46, 47 and 64, in the name of my noble friend Lady Stowell of Beeston, seek to require platforms to assess the risk of, and set terms for, content currently set out in Clause 12. Additionally, the amendments seek to place duties on services to assess risks to freedom of expression resulting from user empowerment tools. Category 1 platforms are already required to assess the impact on free expression of their safety policies, including user empowerment tools; to keep that assessment up to date; to publish it; and to demonstrate the positive steps they have taken in response to the impact assessment in a publicly available statement.

Amendments 48 and 100, in the name of the noble Lord, Lord Stevenson, seek to introduce a stand-alone duty on category 1 services to protect freedom of expression, with an accompanying code of practice. Amendments 49, 50, 53A, 61 and 156, in the name of the noble Baroness, Lady Fox, seek to amend the Bill’s Clause 17 and Clause 18 duties and clarify duties on content of democratic importance.

All in-scope services must already consider and implement safeguards for freedom of expression when fulfilling their duties. Category 1 services will need to be clear what content is acceptable on their services and how they will treat it, including when removing or restricting access to it, and that they will enforce the rules consistently. In setting these terms of service, they must adopt clear policies designed to protect journalistic and democratic content. That will ensure that the most important types of content benefit from additional protections while guarding against the arbitrary removal of any content. Users will be able to access effective appeal mechanisms if content is unfairly removed. That marks a considerable improvement on the status quo.

Requiring all user-to-user services to justify why they are removing or restricting each individual piece of content, as Amendment 53A would do, would be disproportionately burdensome on companies, particularly small and medium-sized ones. It would also duplicate some of the provisions I have previously outlined. Separately, as private entities, service providers have their own freedom of expression rights. This means that platforms are free to decide what content should or should not be on their website, within the bounds of the law. The Bill should not mandate providers to carry or to remove certain types of speech or content. Accordingly, we do not think it would be appropriate to require providers to ensure that free speech is not infringed, as suggested in Amendment 48.

Similarly, it would not be appropriate to require providers to give the same weight to protecting freedom of expression as to safety, as required under Amendment 61. Both amendments would, in effect, require platforms to carry legal content—even if they did not wish to—for safety, commercial or other reasons. This would likely result in worse outcomes for many users.

We have designed the regulatory framework to balance protecting user safety and freedom of expression. Platforms and Ofcom have duties relating to freedom of expression for which they can be held to account. A “must balance” test suggests there is a clear line to be drawn as to where legal content should be removed. This is in conflict with our policy, which accepts that it would be inappropriate for the Government to require companies to remove legal content accessed by adults. It also recognises that, as private entities, companies have the right to remove legal content from their services if they wish to do so. Preventing them from doing so by requiring them to balance this against other priorities could have unintended consequences.

Government Amendments 50A and 50F in my name seek to clarify that the size and capacity of the provider are important in construing the reference to proportionate systems and processes with regard to the duties on category 1 services to protect journalistic content and content of democratic importance. These amendments increase legal certainty and make the structure of these clauses consistent with other references to proportionality in the Bill. Without these amendments, it would be less clear which factors are important when construing whether a provider’s systems and processes to protect journalistic content and content of democratic importance are proportionate.

Amendment 51 in the name of the noble Lord, Lord Stevenson of Balmacara, seeks to change the duty of category 1 services to protect journalistic content so it applies only to journalism which they have judged to be in the public interest. This would delegate an inappropriate amount of power to platforms. Category 1 platforms are not in a position to decide what information is in the interests of the British public. Requiring them to do so would undermine why we introduced the Clause 15 duties—

Why would it not be possible for us to try to define what the public interest might be, and not leave it to the platforms to do so?

I ask the noble Viscount to bear with me. I will come on to this a bit later. I do not think it is for category 1 platforms to do so.

We have introduced Clause 15 to reduce the powers that the major technology companies have over what journalism is made available to UK users. Accordingly, Clause 15 requires category 1 providers to set clear terms of service which explain how they take the importance of journalistic content into account when making their moderation decisions. These duties will not stop platforms removing journalistic content. Platforms have the flexibility to set their own journalism policies, but they must enforce them consistently. They will not be able to remove journalistic content arbitrarily. This will ensure that platforms give all users of journalism due process when making content moderation decisions. Amendment 51 would mean that, where platforms subjectively reached a decision that journalism was not conducive to the public good, they would not have to give it due process. Platforms could continue to treat important journalistic content arbitrarily where they decided that this content was not in the public interest of the UK.

In his first remarks on this group the noble Lord, Lord Stevenson, engaged with the question of how companies will identify content of democratic importance, which is content that seeks to contribute to democratic political debate in the UK at a national and local level. It will be broad enough to cover all political debates, including grass-roots campaigns and smaller parties. While platforms will have some discretion about what their policies in this area are, the policies will need to ensure that platforms are balancing the importance of protecting democratic content with their safety duties. For example, platforms will need to consider whether the public interest in seeing some types of content outweighs the potential harm it could cause. This will require companies to set out in their terms of service how they will treat different types of content and the systems and processes they have in place to protect such content.

Amendments 57 and 62, in the name of my noble friend Lord Kamall, seek to impose new duties on companies to protect a broader range of users’ rights, as well as to pay particular attention to the freedom of expression of users with protected characteristics. As previously set out, services will have duties to safeguard the freedom of expression of all users, regardless of their characteristics. Moreover, UK providers have existing duties under the Equality Act 2010 not to discriminate against people with characteristics which are protected in that Act. Given the range of rights included in Amendment 57, it is not clear what this would require from service providers in practice, and their relevance to service providers would likely vary between different rights.

Amendment 60, in the name of the noble Lord, Lord Clement-Jones, and Amendment 88, in the name of the noble Lord, Lord Stevenson, probe whether references to privacy law in Clauses 18 and 28 include Article 8 of the European Convention on Human Rights. That convention applies to member states which are signatories. Article 8(1) requires signatories to ensure the right to respect for private and family life, home and correspondence, subject to limited derogations that must be in accordance with the law and necessary in a democratic society. The obligations flowing from Article 8 do not apply to individuals or to private companies and it would not make sense for these obligations to be applied in this way, given that states which are signatories will need to decide under Article 8(2) which restrictions on the Article 8(1) right they need to impose. It would not be appropriate or possible for private companies to make decisions on such restrictions.

Providers will, however, need to comply with all UK statutory and common-law provisions relating to privacy, and must therefore implement safeguards for user privacy when meeting their safety duties. More broadly, Ofcom is bound by the Human Rights Act 1998 and must therefore uphold Article 8 of the European Convention on Human Rights when implementing the Bill’s regime.

It is so complicated that the Minister is almost enticing me to stand up and ask about it. Let us just get that right: the reference to the Article 8 powers exists and applies to those bodies in the UK to which such equivalent legislation applies, so that ties us into Ofcom. Companies cannot be affected by it because it is a public duty, not a private duty, but am I then allowed to walk all the way around the circle? At the end, can Ofcom look back at the companies to establish whether, in Ofcom’s eyes, its requirements in relation to its obligations under Article 8 have or have not taken place? It is a sort of transparent, backward-reflecting view rather than a proactive proposition. That seems a complicated way of saying, “Why don’t you behave in accordance with Article 8?”

Yes, Ofcom, which is bound by it through the Human Rights Act 1998, can ask those questions and make that assessment of the companies, but it would not be right for private companies to be bound by something to which it is not appropriate for companies to be signatories. Ofcom will be looking at these questions but the duty rests on it, as bound by the Human Rights Act.

It is late at night and this is slightly tedious, but in the worst of all possible circumstances, Ofcom would be looking at what happened over the last year in relation to its codes of practice and assertions about a particular company. Ofcom is then in trouble because it has not discharged its Article 8 obligations, so who gets to exercise a whip on whom? Sorry, whips are probably the wrong things to use, but you see where I am coming from. All that is left is for the Secretary of State, but probably it would effectively be Parliament, to say to Ofcom, “You’ve failed”. That does not seem a very satisfactory solution.

Platforms will be guided by Ofcom in taking measures to comply with their duties which are recommended in Ofcom’s codes, and which contain safeguards for privacy, including ones based on the European Convention on Human Rights and the rights therein. Paragraph 10(2)(b) of Schedule 4 requires Ofcom to ensure that measures, which it describes in the code of practice, are designed in light of the importance of protecting the privacy of users. Clause 42(2) and (3) provides that platforms will be treated as complying with the privacy duties set out at Clause 18(2) and Clause 28(2), if they take the recommended measures that Ofcom sets out in the codes.

It worked. In seriousness, we will both consult the record and, if the noble Lord wants more, I am very happy to set it out in writing.

Amendment 63 in the name of the noble and learned Lord, Lord Hope of Craighead, seeks to clarify that “freedom of expression” in Clause 18 refers to the

“freedom to impart ideas, opinions or information”,

as referred to in Article 10 of the European Convention on Human Rights. I think I too have been guilty of using the phrases “freedom of speech” and “freedom of expression” as though they were interchangeable. Freedom of expression, within the law, is intended to encompass all the freedom of expression rights arising from UK law, including under common law. The rights to freedom of expression under Article 10 of the European Convention on Human Rights include not only the right to impart ideas, opinions and information but also the right to receive such ideas, opinions and information. Any revised definition of freedom of expression to be included in the Bill should refer to both aspects of the Article 10 definition, given the importance for both children and adults of receiving information via the internet. We recognise the importance of clarity in relation to the duties set out in Clauses 18 and 28, and we are very grateful to the noble and learned Lord for proposing this amendment, and for the experience he brings to bear on behalf of the Constitution Committee of your Lordships’ House. The Higher Education (Freedom of Speech) Bill and the Online Safety Bill serve very different purposes, but I am happy to say that the Bill team and I will consider this amendment closely between now and Report.

Amendments 101, 102, 109, 112, 116, 121, 191 and 220, in the name of my noble friend Lord Moylan, seek to require Ofcom to have special regard to the importance of protecting freedom of expression when exercising its enforcement duties, and when drafting or amending codes of practice or guidance. Ofcom must already ensure that it protects freedom of expression when overseeing the Bill, because it is bound by the Human Rights Act, as I say. It also has specific duties to ensure that it is clear about how it is protecting freedom of expression when exercising its duties, including when developing codes of practice.

My noble friend’s Amendment 294 seeks to remove “psychological” from the definition of harm in the Bill. It is worth being clear that the definition of harm is used in the Bill as part of the illegal and child safety duties. There is no definition of harm, psychological or otherwise, with regard to adults, given that the definition of content which is harmful to adults was removed from the Bill in another place. With regard to children, I agree with the points made by the noble Baroness, Lady Kidron. It is important that psychological harm is captured in the Bill’s child safety duties, given the significant impact that such content can have on young minds.

I invite my noble friend and others not to press their amendments in this group.

My Lords, your Lordships will want me to be brief, bearing in mind the time. I am very grateful for the support I received from my noble friends Lady Harding and Lady Fraser and the noble Baronesses, Lady Kidron and Lady Bull, for the amendments I tabled. I am particularly grateful to the noble Baroness, Lady Bull, for the detail she added to my description of the amendments. I can always rely on the noble Baroness to colour in my rather broad-brush approach to these sorts of things.

I am pleased that the noble Lord, Lord Stevenson, made his remarks at the beginning of the debate. That was very helpful in setting the context that followed. We have heard a basic theme come through from your Lordships: a lack of certainty that the Government have struck the right balance between privacy protection and freedom of expression. I never stop learning in your Lordships’ House. I was very pleased to learn from the new Milton—my noble friend Lord Moylan—that freedom of expression is a fundamental right. Therefore, the balance between that and the other things in the Bill needs to be considered in a way I had not thought of before.

What is clear is that there is a lack of confidence from all noble Lords—irrespective of the direction they are coming from in their contributions to this and earlier debates— either that the balance has been properly struck or that some of the clauses seeking to address freedom of speech in the Bill are doing so in a way that will deliver the outcome and overall purpose of this legislation as brought forward by the Government.

I will make a couple of other points. My noble friend Lord Moylan’s amendments about the power of Ofcom in this context were particularly interesting. I have some sympathy for what he was arguing. As I said earlier, the question of power and the distribution of it between the various parties involved in this new regime will be one we will look at in broad terms certainly in later groups.

On the amendments of the noble Lord, Lord Stevenson, on Clauses 13, 14 and so on and the protections and provisions for news media, I tend towards the position of my noble friend Lord Black, against what the noble Lord, Lord Stevenson, argued. As I said at the beginning, I am concerned about the censorship of our news organisations by the tech firms. But I also see his argument, and that of the noble Viscount, Lord Colville, that it is not just our traditional legacy media that provides quality journalism now—that is an important issue for us to address.

I am grateful to my noble friend the Minister for his round-up and concluding remarks. Although it is heartening to hear that he and the Bill team will consider the amendment from the noble and learned Lord, Lord Hope, in this group, we are looking—in the various debates today, for sure—for a little more responsiveness and willingness to consider movement by the Government on various matters. I hope that he is able to give us more encouraging signs of this, as we proceed through Committee and before we get to further discussions with him—I hope—outside the Chamber before Report. With that, I of course withdraw my amendment.

Amendment 46 withdrawn.

Amendments 47 and 48 not moved.

Clause 13: Duties to protect content of democratic importance

Amendments 49 and 50 not moved.

Amendment 50A

Moved by

50A: Clause 13, page 14, line 8, at end insert—

“(5A) In determining what is proportionate for the purposes of subsection (2), the size and capacity of the provider of a service, in particular, is relevant.”

Member’s explanatory statement

This amendment indicates that the size and capacity of a provider is important in construing the reference to “proportionate systems and processes” in clause 13 (duties to protect content of democratic importance).

Amendment 50A agreed.

Clause 13, as amended, agreed.

House resumed.

House adjourned at 10.20 pm.