Lords Chamber

Volume 830: debated on Tuesday 16 May 2023

House of Lords

Tuesday 16 May 2023

Prayers—read by the Lord Bishop of Gloucester.

Autism: Diagnosis Targets

Question

Asked by

To ask His Majesty’s Government what steps they are taking to ensure NHS targets for autism diagnosis are met.

My Lords, I beg leave to ask the Question standing in my name on the Order Paper, and I declare my interest as a vice-president of the National Autistic Society.

We recognise that not all areas are meeting the NICE recommended maximum of 13 weeks between a referral for an autism assessment and a first appointment. In 2022-23, we invested £2.5 million to test and improve autism diagnostic pathways. In 2023-24, there is a £4.2 million grant to improve services for autistic children and young people. In April, NHS England published a national framework and operational guidance to help the NHS and local authorities improve autism assessment services.

My Lords, by next year 190,000 patients are expected to be waiting for an autism diagnosis—it is already 130,000, and 67,000 of them have been waiting for more than a year. Research shows that there is a widening gap between the number of people who need to be seen and the number of staff available. In the last four years, we have managed to recruit just 19% more staff. The letter that the Minister helpfully sent us in April indicates that he is as concerned about this matter as any of us in this House. We appreciate that, but my question is simple: what is being done to recruit more staff?

I thank the noble Lord, both for his question and for his interest and work in this space. The House will know that this topic is quite close to my heart as well. It is an area of challenge. We have more demand than ever. We are committed to recruiting more staff. We have a recruitment target for next year of 27,000. Very promisingly—I hope I will have time to go into this in more detail later, or I will speak to the noble Lord afterwards—there is a pilot scheme in Bradford looking at children’s early years scoring and how that can be used as a precursor to screening and testing.

My Lords, I too declare an interest as a vice-president of the National Autistic Society—I am always pleased to work alongside my colleague, the noble Lord, Lord Touhig, on these matters. Very often, parents, in desperation, particularly want an autism assessment when their teenagers get to the stage where they are leaving school and going on to further education or other types of study. Without that assessment, no decisions can be made. We have many excellent centres around this country, particularly places such as the Lorna Wing Centre, where assessments can be made. Is it not time that the Government outsourced some of this, as long as the NICE guidelines are followed in giving that assessment, to ensure that the list that the noble Lord announced to the House is reduced much more rapidly than is happening at the moment?

Yes, absolutely, we need to look at all areas where we can increase and expand supply, including use of the private sector. I am sure I will be asked about ADHD later on and the “Panorama” programme, which shows that there are some pitfalls in all that, but provided they are assessing according to the NICE guidelines, it clearly has to be sensible to use as much supply as possible.

My Lords, would the Minister agree that when you delay an assessment, you delay support from the entire structure of government, which we have said should be helping? What help is his department getting from the Department for Education and the Department for Work and Pensions to ensure people are getting to these assessments? If they cannot get the full assessment, can some intermediate steps be taken to ensure that people actually get the help they are entitled to?

We are working closely with the Department for Education. The Bradford pilot scheme I mentioned takes the early years foundation stage profile scores of children. It knows that if you have a low score, you are far more likely to have autism. That triggers a multidisciplinary team to come in and inspect. That is a way that we can use that as an early warning indicator and then follow it up with volume. I hope that working very closely with the DfE in this space will be a real way forward.

My Lords, the Minister has already anticipated receiving this question—I would not want to disappoint him. He is clearly aware that there is some question over the reliability of some diagnoses that are being offered, particularly in the private sector, for ADHD, which is another neurodevelopmental disorder. Is he confident that, in trying to scale up the availability of diagnosis, which is obviously an admirable aspiration, the quality of those diagnoses will be maintained? Are the NICE guidelines sufficiently robust to ensure that?

As we all know, it is a complex area, and there is no black and white diagnosis of autism. The noble Baroness’s point is absolutely correct: we need to make sure that the quality is there. The Bradford pilot has now been running in 100 locations. Every child has to get an early years profile score. If we can show the linkages and follow that up with the screening programme, that will be very promising; but, absolutely, we have to make sure that the right assessment is made.

My noble friend the Minister has rightly said that it is important to expand supply and work with the private sector. Can he tell me about the work that the department is doing with civil society organisations and charities in expanding supply?

Yes, when I talk about supply, it is in all these fields. There are organisations of which I have personal experience, including the National Autistic Society, which does tireless work and has helped me out personally. So I know just how good they are in this situation. Absolutely, the whole strategy in this space is to expand supply by both the private sector and the independent and charity sectors.

My Lords, on autistic children, do the Government keep separate facts and figures for minority communities? I have encountered quite a few ethnic minority children who are suffering from autism. Are there separate facts and figures anywhere for these children?

Clearly, we pull together all the numbers. Typically, about 2.9% of children and young people are diagnosed with autism. I do not know whether that is different among ethnic minorities. I will happily research that and write to the noble Lord.

My Lords, may I ask my noble friend about artificial intelligence—AI? It is going to have a transformational impact on our National Health Service, for good, or possibly for ill. It will transform diagnosis, treatment, outcomes and—who knows?—it may even help us to make appointments more effectively. Of course, it will have an impact on those who work in the National Health Service as well as those who are treated by it. Have the Government started getting to grips with analysing what lies ahead with artificial intelligence? If not, I encourage them to do so very quickly because I believe that the impact of this will come much more rapidly than we might perhaps think at the moment.

First, I totally agree with my noble friend’s sentiment about the power that AI, when done in the right way, can have in this space. Clearly, the stress is on the words “the right way”. I think it is fair to say that we are all on the nursery slopes as regards what it can do. I have seen how effective it can be in taking doctors’ notes, recording a meeting and drafting action points, which a doctor can then review. I am sure that we would all agree that that is very promising. There are future generations of AI being talked about that may be able to perform diagnosis. In the 10 to 15 years of looking ahead in the long-term workforce plan, these are some of the things that we will have to try to take into account. However, we are in the very early stages.

My Lords, when it comes to autism services, we know that there are major disparities across the country which predate the pandemic but which were made much worse by it. The number of people waiting for an assessment has grown by 169% from pre-pandemic levels. How will the Minister ensure that the national framework and the standards for autism assessment within it are deliverable at a local level and in every part of the country?

First, each ICB now has to have a lead for autism and learning difficulties. The noble Baroness is correct that there are some disparities—I am sure that she is aware of the two ICBs which have restricted their services quite significantly, although, thankfully, they are now rowing back on that. We need to make sure that we are on top of all of them. As the noble Baroness is aware, I and other Ministers are taking a personal interest in this. Clearly, there is a lot of work to be done.

My Lords, what are the Government doing? Are they supporting research to find out the causes of this apparent huge increase in autism which we have seen in recent years?

There are a couple of factors. Obviously, the strains and stresses of Covid have brought a lot of these things out into the open. It is good that people are becoming much more aware. My experience dates back 20 years when no one had even really heard of Asperger’s, so it is good that we are aware of it today. It is also good that many more people are now diagnosed with it.

Whistleblowing Framework

Question

Asked by

To ask His Majesty’s Government whether the whistleblowing framework will include an assessment of the desirability of setting up an independent Office of the Whistleblower to deliver its objectives.

My Lords, the Government recognise how valuable it is that whistleblowers are prepared to shine a light on wrongdoing and believe that they should be able to do so without fear of recriminations. The Government launched a review of the whistleblowing framework on 27 March this year. This will examine the effectiveness of the existing framework in meeting its intended objectives, which are to enable workers to come forward and speak up about wrongdoing and to protect those who do so against detriment and dismissal. The review will provide an up-to-date evidence base on whistleblowing.

My Lords, the APPG for Whistleblowing, many Members of both Houses, scores of whistleblowers, significant legal counsel involved with whistleblowing, and even regulators in their evidence to the APPG have called for an office of the whistleblower. Will this review give full consideration to such an office—yes or no? If not, why not?

My Lords, the review will gather and examine evidence on the effectiveness of the GB whistleblowing framework in meeting its objectives. The framework was introduced way back in 1996, updated in 1998 and appended to since. I do not believe that an office of the whistleblower is part of this review. Having said that, once the review has been completed and has reported, that is clearly something that should be considered.

My Lords, while I appreciate my noble friend’s sentiments and objectives, do we really have to prostitute the English language in order to achieve them?

My Lords, I thank the Minister for his letter yesterday to those of us who spoke in the debate on the amendment from the noble Baroness, Lady Kramer, on this very issue on day 6 of the Economic Crime and Corporate Transparency Bill Grand Committee. It deals particularly with data reporting. That is very helpful, although it shows that the regulators implement this patchily, inconsistently and, arguably, uselessly. So is it too late for the terms of reference of the review to be revised specifically to include an office of the whistleblower but, more particularly, to look at whether we could financially reward whistleblowers for information? That has generated a dramatically better climate for whistleblowing in the United States of America. Could we also look at our experience from the pandemic? Given the poor recovery rate for the public funds misspent on PPE, surely this is worth trying.

The noble Lord has asked a lot of questions. The regulators and the level of consistency in reporting are absolutely part of the brief. The review is not currently structured to look at the question of a department of the whistleblower but, as I said in my Answer, I believe that that may well be a recommendation that comes out of it. I am afraid I cannot remember what the other question was.

My Lords, maybe I can help the Minister out on this point by following on from the question from the noble Lord, Lord Browne, and asking what consideration has been given to replicating what is done in the United States in offering financial incentives? Different levels of compensation are paid out commensurate with the quality of information on economic crimes and for the strengthening of sanctioning regimes overall.

My Lords, at the moment the Government do not think that financial incentives to whistleblow are an appropriate way to take this review or any future interest forward.

I congratulate the Government on having a review of whistleblowing, which clearly is long overdue. I thank my noble friend for his letter and engagement with us on the whistleblowing issue in the Economic Crime and Corporate Transparency Bill, but does he consider that there is adequate protection in the current framework against career detriment and dismissal for whistleblowers? Does he not think that those who are working inside firms are best placed to blow the whistle and uncover crimes before any regulator tries to sweep up the mess afterwards? Therefore, looking at examples overseas, such as in America, that seem to work much better than here might be worth considering.

My Lords, I think I have answered the question about the American system. Having said that, we will of course look at what is current practice and best practice overseas to see how we can take this whole process forward. Surely what we are trying to do is to come up with a world-class whistleblowing framework and structure that protects workers who come forward and risk their employment and, to some extent, their financial future in calling out this potential fraud.

My Lords, my direct experience of whistleblowers is that doing that actually devastates their life, their income and their circle of friends. They have a very tough time, particularly if they come from the police or the military. I am slightly concerned that the Minister almost seemed to rule out an office of the whistleblower, because it would be wrong to have a consultation and not look at all the options. It seems to me that we need something very drastic, because the current legislation is no protection at all.

My Lords, I quite agree that taking any form of case to an employment tribunal is risky and difficult for the individuals involved, and it is no less so in a whistleblowing case. When the review is done and we are able to take a look at what the recommendations are—I think I have said this twice already—it would be very surprising if the question of an office of the whistleblower did not come up.

My Lords, evidence suggests that the UK’s law on whistleblowing is falling behind international best practice and has not kept pace with the modern workplace or the scale of the problem. We welcome the Government’s announcement of the establishment of the review into the whistleblowing legislation, but perhaps the Minister could tell us when this will report. In the light of that, would he not agree that the Economic Crime and Corporate Transparency Bill, shortly going to Report, provides an ideal opportunity to act now to bring in immediate, obvious improvements? Further, does he agree that requiring all employers to introduce effective internal arrangements would be a beneficial step forward?

I absolutely agree that the current whistleblowing framework does not reflect best practice. This was basically put in place in 1996 and 1998, and that is one of the main purposes for the review being carried out now. The review is due to complete its work in October this year and will report very soon after that. Hopefully, from that, we can get to a situation where we are able to move forward to something that really is world-class.

My Lords, does the Minister have any information on the percentage of whistleblowing claims that are false?

State Pension Underpayment Errors

Question

Asked by

To ask His Majesty’s Government what steps they are taking to tackle under-payment errors in state pensions.

My Lords, the Government are fully committed to ensuring that state pension error is put right as quickly as possible. More than 1,300 staff have been recruited or redeployed to the ongoing state pension underpayment correction exercise, with case reviews expected to significantly increase this year. This is an issue that dates back many years, and we are working hard to correct these historic errors and to ensure that they do not happen again.

My Lords, I thank the noble Viscount for his reply and I know that he takes the issue seriously. However, it is notable that the figures published last week by the Office for National Statistics showed that the main cause of underpayment was what it termed “official error”, and in the last financial year, the underpayments totalled £580 million—£50 million more than in the previous year. It is getting worse. I note what the Minister says about additional staff, but it is clear that more needs to be done.

The noble Lord is right. We know that 700,000 cases require review; an estimated 230,000 customers will be affected. In terms of what we have actually done, 173,538 cases have been reviewed; 46,760 underpayments have been identified, and just over £300 million was paid in arrears. As for the reasons that were highlighted by the noble Lord, they are multifarious. One is that DWP staff sometimes fail to manually set an action system prompt on state pension accounts to review payments at trigger points, such as a claimant reaching their 80th birthday.

Is my noble friend aware of the beneficence of his department in that those who have reached their fourscore years get a huge 25p a week supplement, which, to the best of my knowledge, has never been reviewed since 1971? Is this good value for money?

I take note of what my noble friend has said. It is interesting to note that we are talking about an overpayment rather than an underpayment. Far be it from me to authorise taking away 25p from my noble friend, despite the fact that I am a Scotsman.

My Lords, the department has said that the current large-scale correction for those cheated of their full entitlement should be completed by the end of 2024, but in its most recent annual report it admitted a different error, relating to home responsibilities protection, where thousands of mothers are missing out on NI credits. Can the Minister assure the House that the department will not wait until the end of the current correction exercise before it starts on this new category of cheated errors?

The noble Lord makes a good point about home responsibilities protection, which is one of the issues that we are looking at in a timely fashion. We will be providing estimates and next steps for corrective action in the summer. Obviously, we are looking to move at pace to resolve these issues.

My Lords, the noble Viscount’s Written Statement last week celebrated the fall in fraud and overpayment error in the social security system as a whole, but it rather glossed over the increase in underpayment to £3.3 billion. That is money which is not going into the pockets of people who need it. Do the Government not think that underpayments are as important as overpayments, and what are they doing to minimise underpayments?

Of course they are important. Any underpayment is incredibly important, as I am sure the noble Baroness would agree. The department became aware of issues with state pension underpayments in 2020 and, as mentioned earlier, the issues go back several decades and through different Governments. We have taken immediate action to investigate the extent of the problem and are carrying out highly complex scans of computer systems. Correction activity commenced on 11 January 2021; I say again that this is an important matter and we are moving at pace.

My Lords, as the exercise is focusing on women, and women’s state pensions are still noticeably lower than those of men, are those who are entitled to arrears also entitled to some kind of compensation or consolation payment? How is my noble friend’s department prioritising the work?

I note what my noble friend says about the gender disparities, which we are alert to. Indeed, the department has a discretionary scheme which allows special payments to be made to customers to address any hardship, but particularly injustice caused by DWP maladministration. Consistent with other large-scale LEAP exercises, special payments under the DWP discretionary scheme will not, however, routinely be made, but I assure the House that they are regarded or assessed on a case-by-case basis. Finally, on prioritising, it is important to note that we are prioritising those who are alive over those who are deceased.

My Lords, I am one of those women who were underpaid. For years, I got £6 a week—I was very exercised over how to spend it—whereas many of my women friends who had never worked at all were getting much more than that. With expert advice, I was able to access the department and it was set right, but it seemed to me that the problem was how to access the department. Once it had the issue in hand it responded, but people need to know the email addresses and there need to be pamphlets in post offices. There need to be easy ways for older people to speak to someone in the department and get an answer when they write—without, of course, having to hold on to the phone for ages. Will the Minister ensure that that happens?

Indeed, and it is very important that we engage much more closely with the customer base. Where underpayments are identified, the DWP will contact the individual to inform them of any changes to their state pension amount and of any arrears involved. There is now, I am pleased to say, a more direct route for those inquiring about underpaid state pension. Guidance on this, the House may not be surprised to hear, is on GOV.UK and went live in July last year.

My Lords, these cases are very urgent for some people; 25p may be an issue for the over-80s, but in just January and February 14,500 over-80s were found to have been underpaid—out of a total of 46,000 underpayments. The worst affected were those who had been widowed, who were underpaid by, on average, £11,500. We all know how quickly the DWP will go after you if you get overpaid, so can the Minister assure us of two things? First, is priority being given to those who most need the money and who, frankly, may need it rather more urgently for reasons such as more advanced age? Secondly, the NAO suggested in its very damning report that the department assess all underpayments to see whether there is a systemic cause which might affect other cases. Is that now being done?

Very much so; it is being done. I think I alluded to this earlier. Any systemic problem has to be looked at as a matter of urgency. On the other question the noble Baroness raised, I mentioned the number of extra people we have put on to this particular case. I reassure her and the House that the data shows that we have reviewed an average of more than 15,000 cases per month between November 2022 and February 2023, compared with an average of only 5,000 per month over the first 22 months of the exercise.

My Lords, it is estimated that between 20,000 and 25,000 pensioners die each year because of low income and the hard choices they have to make between heating and eating. Can the Minister explain whether any assessment has been made of the deaths and hardship caused by underpayment of state pension over the last 13 years?

No, I do not have any figures to support the argument that the noble Lord is proffering. What I can say is that we very much take note of wanting to support the most vulnerable. We have increased benefits in line with the September 2022 consumer prices index of 10.1%, including for around 12 million pensioners.

My Lords, is the Minister aware that, due to their incompetence, the Scottish Government have underspent their budget in the last financial year by £2 billion, which could have been spent helping carers and others? Will the Minister confirm that this money will now go back to the Treasury and not be spent helping poor people in Scotland?

Allow me to look into that and provide an answer to the noble Lord. I think it is normally the case that the money goes back to the Treasury but, without knowing for certain, I do not want to stick my neck out on it.

My Lords, could the Minister indicate what corrective action will be taken to address the needs of the WASPI women, who have been underpaid for many years and are not entitled to their pensions from the age of 60?

I am aware, as we all are, of the WASPI issue. The noble Baroness will be aware of the judicial review against the PHSO. We are aware of it, but I am unable to comment because of the judicial review.

Carers: Financial Support

Question

Asked by

To ask His Majesty’s Government whether they plan to review the financial support available to unpaid carers following new research by Carers UK and the University of Sheffield which found that they contribute £445 million daily to the economy in England and Wales.

My Lords, the Government recognise the vital role played by millions of unpaid carers across the country. We are already providing them with record amounts of financial support through the benefits system, including nearly £3.5 billion per year to around 1 million carers through the carer’s allowance alone. Those unpaid carers in lower-income households could also receive an additional £2,200 per year through the universal credit carer element.

My Lords, I was only asking for a review; it seems a modest enough request in view of the £445 million contributed every day by unpaid carers. May I ask the Minister something very specific which, if there should be a review, he would be able to consider? The earnings limit for carer’s allowance is not rising as quickly as the national living wage. The number of hours carers are allowed to work will reduce from 14 to 13 before they lose their entitlement to the benefit. This means that carers are very limited in their ability to undertake paid work and combine it with their caring, which many of them wish to do. Does the Minister agree that deterring carers from working is really not sensible, and that the earnings limit should be increased to a minimum of 21 hours at national living wage rates?

I know the noble Baroness has much experience in this particular area. On the carer’s allowance, I can reassure her that we continue to review the limit and make changes where we feel they are warranted and affordable. The carer’s allowance has an earnings limit, which she alluded to, which permits carers to undertake some part-time work; it also recognises the benefits of staying in touch with the workplace, which we regard as important, including providing greater financial independence and social interaction. As the noble Baroness will know, it can be extremely lonely and very hard work being a carer, as the hours are often long and the work very demanding.

My Lords, a place in a local authority care home will cost a local authority a minimum of £800 a week, which is over £700 more than is paid to a carer who cares for more than 35 hours a week, as carer’s allowance is only £76.75. Does the Minister agree that the Government and local taxpayers are getting a very good deal on the backs of unpaid carers?

We definitely want to applaud the huge number of unpaid carers who work in our society. Caring for a family member or friend, as we know, can be enormously hard work but it can also be incredibly rewarding. To pick up on the noble Baroness’s point, means testing comes into this, and it can increase weekly income and act as a passport to other support, including help with fuel costs through schemes such as the warm home discount and cold weather payments, and more recently payments to help with increases in the cost of living.

My Lords, as an officer of the APPG on 22q—a genetic syndrome that is half as prevalent as Down’s syndrome, with similarly far-reaching effects—I know carers who are parents of disabled children who can suddenly find that they have to be in hospital with their child for several days. They also attend far more medical appointments than normal. Do the Government perceive a need to encourage and enable employers to show greater flexibility in these unavoidable circumstances, and how might they do that?

My noble friend makes a very good point. As I said earlier, we are committed to supporting unpaid carers to balance the care they may give alongside work, if they are able to do so. Some caring responsibilities are extremely demanding. My noble friend may know that the Carer’s Leave Bill is currently going through Parliament. This will introduce a new leave entitlement as a day-one right, available to all employees who are providing care to a dependant with a long-term care or support need.

My Lords, we have received the Government’s response to the report from the Adult Social Care Committee and we are grateful for it. Is the Minister aware that in that report we recommended that carer’s allowance be reviewed in the next year? We recommended that the threshold for the hours of caring be reduced so that people could access carer’s allowance more easily, and that the allowance be uprated in line with the national living wage. All those recommendations have been rejected. Can the noble Viscount tell me why?

First, I wish the noble Baroness a very happy birthday. Moving on swiftly to her question, I very much note the points the report has produced; I read it over the weekend and it makes some important points. I said earlier how much we value the role of unpaid carers. Yes, the rate of carer’s allowance is £76.75 a week. The total caseload is 1.4 million and I think it very important indeed that we continue to review the role of carers and the carer’s allowance, but, as I mentioned earlier, there is a means-tested element here and top-ups are available for those in need.

My Lords, in addition to the issue of financial support for unpaid adult carers, we must not forget the contribution of young carers, who provide invaluable support to their families. What are the Government doing to ensure financial support for respite support, as well as access to a young carer’s lead in their school or college, as is currently available in Gloucestershire?

The right reverend Prelate makes a very good point, and that is certainly an element of what we are doing and looking at. As I said, the main point is that we very much recognise the importance of carers and their work. Indeed, Carers Week runs from 5 to 11 June this year. On respite care, the right reverend Prelate makes an important point.

My Lords, there are many young carers between 16 and 25 in full-time education—around 375,000—but they seem to get a particularly raw deal, in that they are not eligible for any state financial support and have to look to charities. Will the Minister take a look at their predicament?

I am very aware that some carers are extremely young, and I say again that I recognise the role of unpaid carers. The carer’s allowance is not intended to be a replacement for a wage or a payment for caring services, so we cannot compare it to the national minimum wage or the national living wage, for example. The noble Baroness raises another important point that we should continue to look at.

My Lords, universal credit is a replacement for a wage, and there are people on it who can work only part time because of the need to care for a loved one, and, in some cases, because they simply cannot get hold of formal social care any more —things are pretty tough at the moment. They are not automatically excluded from the requirement to look for full-time work while on universal credit, so what guidance is given to universal credit work coaches in those circumstances?

The guidance is continually updated for them. The noble Baroness will be aware of the link between the carer’s allowance and the universal credit tapering system, so that, where tapering applies, universal credit is reduced by 55p for every £1 earned.

My Lords, I am glad to hear that the Government want to support unpaid carers, but one of the problems is that they are invisible to all systems, whether that is health and care services, benefits or other government departments. So what are the Government doing to ensure that we identify carers who have this important role?

Again, there is more to be done to highlight the enormous amount of tremendous work that carers do. We are working on this, particularly in tandem with our colleagues in DHSC. I have certainly noted this and will take it forward. If there is something that I can write to my noble friend with, I will do so.

My Lords, the whole House will recognise that, at any time, the whole lifestyle of any of us could be changed by a dramatic illness of a close relative. As indicated, the position of unpaid carers is largely not recognised or sometimes ignored, so that, when they are concerned about their relative and get in touch with one of the agencies, they are often disregarded because they are not the patient, and their views are not sought, even though they are providing a huge amount of care. Can the noble Viscount assure us that everything is being done to improve the recognition of unpaid carers’ contribution?

Absolutely, and this ties in with my noble friend’s question. I reassure both the noble Lord and my noble friend that we are improving the recognition, identification and involvement of unpaid carers, particularly in local areas. There are new duties in the Health and Care Act 2022 around involving carers, including in hospital discharge, and new guidance has been prepared for the integrated care strategies, as well as new SCIE guidance for commissioners on breaks for adult carers.

Online Safety Bill

Committee (7th Day)

Relevant document: 28th Report from the Delegated Powers Committee

Amendment 56

Moved by

56: After Clause 17, insert the following new Clause—

“OFCOM reviews of complaints systems

“(1) Within the period of one year beginning on the day on which this Act is passed, and annually thereafter, OFCOM must review the workings of the complaints systems set up by regulated companies under section 17 (duties about complaints procedures), as to—
(a) their effectiveness;
(b) their cost and efficiency; and
(c) such other matters as seem appropriate.
(2) In undertaking the reviews under subsection (1), OFCOM may take evidence from such bodies and individuals as it considers appropriate.
(3) If OFCOM determines from the nature of the complaints being addressed, and the volumes of such complaints, that systems established under section 17 are not functioning as intended, it may establish an online safety ombudsman with the features outlined in subsections (4) to (8), with the costs of this service being met from the levy on regulated companies.
(4) The purpose of the online safety ombudsman is to provide an impartial out-of-court procedure for the resolution of any dispute between—
(a) a user of a regulated user-to-user service, or a nominated representative for that user, and
(b) the regulated service provider,
in cases where complaints made under processes which are compliant with section 17 have not, in the view of the user (or their representative), been adequately addressed.
(5) The ombudsman must allow for a user (or their representative) who is a party to such a dispute to refer their case to the ombudsman if they are of the view that any feature or conduct of one or more provider of a regulated user-to-user service, which is relevant to that dispute, presents (or has presented) a material risk of—
(a) significant or potential harm;
(b) contravening a user’s rights, as set out in the Human Rights Act 1998, including freedom of expression; or
(c) failure to uphold terms of service.
(6) The ombudsman may make special provision for children, including (but not limited to) prioritisation of—
(a) relevant provisions under the United Nations Convention on the Rights of the Child; or
(b) a child’s physical, emotional or psychological state.
(7) The ombudsman must have regard to the desirability of any dispute resolution service provided by the ombudsman being—
(a) free;
(b) easy to use, including (where relevant) taking into account the needs of vulnerable users and children;
(c) effective and timely;
(d) fair and flexible, taking into account different forms of technology and the unique needs of different types of user; and
(e) transparent.
(8) The Secretary of State must ensure that use of any dispute resolution service provided by the ombudsman does not affect the ability of a user (or their representative) to bring a claim in civil proceedings.”

Member’s explanatory statement

This new Clause would require Ofcom to conduct regular reviews of the effectiveness of complaints procedures under Clause 17. If Ofcom were of the view that such procedures were not functioning effectively, they would be able to establish an online safety ombudsman with the features outlined in subsections (4) to (8) of the Clause.

My Lords, Amendment 56 proposes a pathway towards setting up an independent ombudsman for the social media space. It is in my name, and I am grateful to the noble Lord, Lord Clement-Jones, for his support. For reasons I will go into, my amendment is a rather transparent and blatant attempt to bridge a gap with the Government, who have a sceptical position on this issue, and I hope that the amendment in its present form will prove more attractive to them than our original proposal.

At the same time, the noble Baroness, Lady Newlove, has tabled an amendment on this issue, proposing an independent appeals mechanism

“to provide impartial out of court resolutions for individual users of regulated services”.

Given that this is almost exactly what I want to see in place—as was set out in my original amendment, which was subsequently rubbished by the Government—I have also signed the noble Baroness’s amendment, and I very much look forward to her speech. The Government have a choice.

The noble Baroness, Lady Fox, also has amendments in this group, although they are pointing in a slightly different direction. I will not speak to them at this point in the proceedings, although I make it absolutely clear that, while I look forward to hearing her arguments —she is always very persuasive—I support the Bill’s current proposals on super-complaints.

Returning to the question of why we think the Bill should make provision for an independent complaints system or ombudsman, I suppose that, logically, we ought first to hear the noble Baroness, Lady Newlove, then listen to the Government’s response, which presumably will be negative. My compromise amendment could then be considered and, I hope, win the day with support from all around the Committee—in my dreams.

We have heard the Government’s arguments already. As the Minister said in his introduction to the Second Reading debate all those months ago on 1 February 2023, he was unsympathetic. At that time, he said:

“Ombudsman services in other sectors are expensive, often underused and primarily relate to complaints which result in financial compensation. We find it difficult to envisage how an ombudsman service could function in this area, where user complaints are likely to be complex and, in many cases, do not have the impetus of financial compensation behind them”.—[Official Report, 1/2/23; col. 690.]

Talk about getting your retaliation in first.

My proposal is based on the Joint Committee’s unanimous recommendation:

“The role of the Online Safety Ombudsman should be created to consider complaints about actions by higher risk service providers where either moderation or failure to address risks leads to … demonstrable harm (including to freedom of expression) and recourse to other routes of redress have not resulted in a resolution”.

The report goes on to say that there could

“be an option in the Bill to extend the remit of the Ombudsman to lower risk providers. In addition … the Ombudsman would as part of its role i) identify issues in individual companies and make recommendations to improve their complaint handling and ii) identify systemic industry wide issues and make recommendations on regulatory action needed to remedy them. The Ombudsman should have a duty to gather data and information and report it to Ofcom. It should be an ‘eligible entity’ to make super-complaints”

possible. It is a very complicated proposal. Noble Lords will understand from the way the proposal is framed that it would provide a back-up to the primary purpose of complaints, which must be to the individual company and the service it is providing. But it would be based on a way of learning from experience, which it would build up as time went on.

I am sure that the noble Lord, Lord Clement-Jones, will flesh out the Joint Committee’s thinking on this issue when he comes to speak, but I make the point that other countries preparing legislation on online safety are in fact building in independent complaints systems; we are an outlier on this. Australia, Canada and others have already legislated. Another very good example nearer to hand is in Ireland. We are very lucky to have with us today the noble Baroness, Lady Kidron, a member of the expert panel whose advice to the Irish Government to set up such a system in her excellent report in May 2022 has now been implemented. I hope that she will share her thoughts about these amendments later in the debate.

Returning to the Government’s reservations about including an ombudsman service in the Bill, I make the following points based on my proposals in Amendment 56. There need not be any immediate action. The amendment as currently specified requires Ofcom to review complaints systems set up by the companies under Clause 17 as to their effectiveness and efficiency. It asks Ofcom to take other evidence into account and then, and only then, to take the decision of whether to set up an ombudsman system. If there were no evidence of a need for such a service, it would not happen.

As for the other reservations raised by the Minister when he spoke at Second Reading, he said:

“Ombudsman services in other sectors are expensive”.

We agree, but we assume that this would be on a cost recovery model, as other Ofcom services are funded in that way. The primary focus will always be resolving complaints about actions or inactions of particular companies in the companies’ own redress systems, and Ofcom can always keep that under review.

He said that they are “often underused”. Since we do not know at the start what the overall burden will be, we think that the right solution is to build up slowly and let Ofcom decide. There are other reasons why it makes sense to prepare for such a service, and I will come to these in a minute.

He said that other ombudsman services

“primarily relate to complaints which result in financial compensation”.

That is true, but the evidence from other reports, and that we received in the Joint Committee, was that most complainants want non-financial solutions: they want egregious material taken down or to ensure that certain materials are not seen. They are not after the money. Where a company is failing to deliver on those issues in their own complaints system, to deny genuine complainants an appeal to an independent body seems perverse and not in accordance with natural justice.

He said that

“user complaints are likely to be complex”.—[Official Report, 1/2/23; col. 690.]

Yes, they probably are, but that seems to be an argument for an independent appeals body, not against it.

To conclude, we agree that Ofcom should not be the ombudsman and that the right approach is for Ofcom to set up the system as and when it judges that it would be appropriate. We do not want Ofcom to be swamped with complaints from users of regulated services, who, for whatever reason, have not been satisfied by the response of the individual companies or to complex cases, or seek system-wide solutions. But Ofcom needs to know what is happening on the ground, across the sector, as well as in each of the regulated companies, and it needs to be kept aware of how the system as a whole is performing. The relationship between the FCA and the Financial Ombudsman Service is a good model here. Indeed, the fact that some of the responsibilities to be given to Ofcom in the Bill will give rise to complaints to the FOS suggests that there would be good sense in aligning these services right from the start.

We understand that the experience from Australia is that the existence of an independent complaints function can strengthen the regulatory functions. There is also evidence that the very existence of an independent complaints mechanism can provide reassurances to users that their online safety is being properly supported. I beg to move.

My Lords, this is the first time that I have spoken in Committee. I know we have 10 days, but it seems that we will go even further because this is so important. I will speak to Amendments 250A and 250B.

I thank the noble Lords, Lord Russell of Liverpool and Lord Stevenson of Balmacara, and, of course— if I may be permitted to say so—the amazing noble Baroness, Lady Kidron, who is an absolute whizz on this, for placing their names on these amendments, as well as the 5Rights Foundation, the Internet Watch Foundation and the UK Safer Internet Centre for their excellent briefings. I have spoken to these charities, and the work they do is truly amazing. I do not think that the Bill will recognise just how much time and energy they give to support families and individuals. Put quite simply, we can agree that services’ internal complaint mechanisms are failing.

Let me tell your Lordships about Harry. Harry is an autistic teenager who was filmed by a member of the public in a local fast-food establishment when he was dysregulated and engaging in aggressive behaviour. This footage was shared out of context across social media, with much of the response online labelling Harry as a disruptive teenager who was engaging in unacceptable aggression and vandalising public property. This was shared thousands of times over the course of a few weeks. When Harry and his mum reported it to the social media platforms, they were informed that it did not violate community guidelines and that there was a public interest in the footage remaining online. The family, quite rightly, felt powerless. Harry became overwhelmed at the negative response to the footage and the comments made about his behaviour. He became withdrawn and stopped engaging. He then tried to take his own life.

It was at this point that Harry’s mum reached out to the voluntary-run service Report Harmful Content, as she had nowhere else to turn. Report Harmful Content is run by the charity South West Grid for Learning. It was able to mediate between the social media sites involved to further explain the context and demonstrate the real-world harm that this footage, by remaining online, was having on the family and on Harry’s mental health. Only then did the social media companies concerned remove the content.

Sadly, Harry’s story is not an exception. In 2022, where a platform initially refused to take down content, Report Harmful Content successfully arbitrated the removal of content in 87% of cases, thus demonstrating that even if the content did not violate community guidelines, it was clear that harm had been done. There are countless cases of members of the public reporting a failure to remove content that was bullying them. This culture of inaction has led to apathy and a disbelief among users that their appeals will ever be redressed. Research published by the Children’s Commissioner for England found that 40% of children did not report harmful content because they felt that there

“was no point in doing so”.

The complaints mechanism in the video-sharing platform regulation regime is being repealed without an alternative mechanism to fill the gap. The current video-sharing platform regulation requires platforms to

“provide for an impartial out-of-court procedure for the resolution of any dispute between a person using the service and the provider”

to operate impartial dispute resolution in the event of a dispute. In its review of the first year of this regulation, Ofcom highlighted that the requirements imposed on platforms in scope are not currently being met in full. However, instead of strengthening existing appeals processes, the VSP regime is set to be repealed and superseded by this Bill.

The Online Safety Bill does not have an individual appeals process, meaning that individuals will be left without an adequate pathway to redress. The Bill establishes only a “super-complaints” process for issues concerning multiple cases or cases highlighting a systemic risk. It will ultimately fall to the third sector to highlight cases to Ofcom on behalf of individuals.

The removal of an appeals process—given the repeal of the VSP regime—would be in stark contrast with the direction of travel in other nations. Independent appeals processes exist in Australia and New Zealand, and more countries are also looking at adopting independent appeals. The new Irish Online Safety and Media Regulation Act includes provision

“for the making of a complaint to the Commission”.

The Digital Services Act in Europe also puts a process in place. There is precedent for these systems. It cannot be right that the Republic of Ireland and the UK and its territories have over 52 ombudsmen in 32 sectors, yet none of them works in digital at a time when online harm—especially to children, as we hear time and again in your Lordships’ House—is at unprecedented levels.

The Government’s response so far has been insufficient. When the Online Safety Bill received its seventh sitting debate, much discussion related to independent appeals, referred to here as the need for an ombudsman. The Digital Minister recognised:

“In some parts of our economy, we have ombudsmen who deal with individual complaints, financial services being an obvious example. The Committee has asked the question, why no ombudsman here? The answer, in essence, is a matter of scale and of how we can best fix the issue. The volume of individual complaints generated about social media platforms is just vast”.

Dame Maria Miller MP said that

“it is not a good argument to say that this is such an enormous problem that we cannot have a process in place to deal with it”.—[Official Report, Commons, Online Safety Bill Committee, 9/6/22; col. 295-96.]

In response to the Joint Committee’s recommendation for an ombudsman, the Government said:

“An independent resolution mechanism such as an Ombudsman is relatively untested in areas of non-financial harm. Therefore, it is difficult to know how an Ombudsman service could function where user complaints are likely to be complex and where financial compensation is not usually appropriate. An Ombudsman service may also disincentivise services from taking responsibility for their users’ safety. Introducing an independent resolution mechanism at the same time as the new regime may also pose a disproportionate regulatory burden for services and confuse users … The Secretary of State will be able to reconsider whether independent resolution mechanisms are appropriate at the statutory review. Users will also already have a right of action in court if their content is removed by a service provider in breach of the terms and conditions. We will be requiring services to specifically state this right of action clearly in their terms and conditions”.

Delaying the implementation of an individual appeals process will simply increase the backlog of cases and will allow for the ripple effect of harm to go unreported, unaddressed and unaccounted for.

There is precedent for individual complaints systems, as I have mentioned, both in the UK and abroad. Frankly, the idea that an individual complaints process will disincentivise companies from taking responsibility does not hold weight, given that these companies’ current appeal mechanisms are woefully inadequate. Users must not be left to the courts to have their appeals addressed. This process is cost-prohibitive for most and cannot be the only pathway to justice for victims, especially children.

To conclude, I have always personally vowed to speak up for those who endure horrific suffering and injustices from tormentors. I know how the pain and trauma feel when systems that have been set up are no longer fit for purpose. I therefore say this to my noble friend the Minister: nothing is too difficult if you really want to find a solution. The public have asked for this measure and there is certainly wide precedent for it. By not allowing individuals an appeals process, the Government’s silence simply encourages the tormentors and leaves the tormented alone.

My Lords, I will speak in support of Amendments 250A and 250B; I am not in favour of Amendment 56, which is the compromise amendment. I thank the noble Baroness, Lady Newlove, for setting out the reasons for her amendments in such a graphic form. I declare an interest as a member of the Expert Group on an Individual Complaints Mechanism for the Government of Ireland.

The day a child or parent in the UK has a problem with an online service and realises that they have nowhere to turn is the day that the online safety regime will be judged to have failed in the eyes of the public. Independent redress is a key plank of any regulatory system. Ombudsmen and independent complaint systems are available across all sectors, from finance and health to utilities and beyond. As the noble Lord, Lord Stevenson, set out, they are part of all the tech regulation that has been, or is in the process of being, introduced around the world.

I apologise in advance if the Minister is minded to agree to the amendment, but given that, so far, the Government have conceded to a single word in a full six days in Committee, I dare to anticipate that that is not the case and suggest three things that he may say against the amendment: first, that any complaints system will be overwhelmed; secondly, that it will offer a get-out clause for companies from putting their own robust systems in place; and, thirdly, that it will be too expensive.

The expert group of which I was a member looked very carefully at each of these questions and, after taking evidence from all around the globe, it concluded that the system need not be overwhelmed if it had the power to set clear priorities. In the case of Ireland, those priorities were complaints that might result in real-world violence and complaints from or on behalf of children. The expert group also determined that the individual complaints system should be

“afforded the discretion to handle and conclude complaints in the manner it deems most appropriate and is not unduly compelled toward or statutorily proscribed to certain courses of action in the Bill”.

For example, there was a lot of discussion on whether it could decide not to deal with copycat letters, treat multiple complaints on the same or similar issue as one, and so on.

Also, from evidence submitted during our deliberations, it became clear that many complainants have little idea of the law and that many complaints should be referred to other authorities, so among the accepted recommendations was that the individual complaints system should be

“provided with a robust legal basis for transferring or copying complaints to other bodies as part of the triage process”—

for example, to the data regulator, police, social services and other public bodies. The expert group concluded that this would actually result in better enforcement and compliance in the ecosystem overall.

On the point that the individual complaints mechanism may have the unintended consequence of making regulated services lazy, the expert group—which, incidentally, comprised a broad group of specialisms such as ombudsmen, regulators and legal counsel among others—concluded that it was important for the regulator to set a stringent report and redress code of practice for regulated companies so that it was not possible for any company to just sit back until people were so fed up that they went to the complaints body. The expert group specifically said in its report that it

“is acutely aware of the risk of … the Media Commission … drawing criticism for the failings of the regulated entities to adequately comply with systemic rules. In this regard, an individual complaints mechanism should not be viewed as a replacement for the online platforms’ complaint handling processes”.

Indeed, the group felt that an individual complaints system complemented the powers given to the regulator, which could and should take enforcement against those companies that persistently fail to introduce an adequate complaints system—not least because the flow of complaints would act as an early warning system of emerging harms, which is of course one of the regulator’s duties under the Bill.

When replying to a question from the noble Lord, Lord Knight of Weymouth, last week about funding digital literacy, the Minister made it clear that the online safety regime would be self-financing via the levy. In which case, it does not seem to be out of proportion to have a focused and lean system in which the urgent, the vulnerable and the poorly served have somewhere to turn.

The expert group’s recommendation was accepted in full by Ireland’s Minister for Media, Culture and Tourism, Catherine Martin, who said she would

“always take the side of the most vulnerable”

and the complaint system would deal with people who had

“exhausted the complaints handling procedures by any online services”.

I have had the pleasure of talking to its new leadership in recent weeks, and it is expected to be open for business in 2024.

I set that out at length just to prove that it is possible. It was one of the strong recommendations of the pre-legislative committee, and had considerable support in the other place, as we have heard. I think both Ofcom and DSIT should be aware that many media outlets have not yet clocked that this complicated Bill is so insular that the users of tech have no place to go and no voice.

While the Bill can be pushed through without a complaints system, this leaves it vulnerable. It takes only one incident or a sudden copycat rush of horrors, which have been ignored or trivialised by the sector with complainants finding themselves with nowhere to go but the press, to undermine confidence in the whole regulatory edifice.

So I have three questions for the Minister. The first two are on the VSP regime which, as was set out by the noble Baroness, Lady Newlove, is being cancelled by the Bill. First, could the Minister confirm to the Committee that the VSP complaints system has done nothing useful since it was put in place? Therefore, was the decision to repeal it based on its redundancy?

Secondly, if the system has indeed been deemed redundant, is that because of a failure of capacity or implementation by Ofcom—this is crucial for the Committee to understand, as Ofcom is about to take on the huge burden of this Bill—or is it because all the companies within the regime are now entirely compliant?

Thirdly, once a child or parent has exhausted a company’s complaints system, where, under the Bill in front of us, do the Government think they should go?

I have not yet heard from the noble Baroness, Lady Fox, on her amendments, so I reserve the right to violently agree with her later, but I simply do not understand her reasoning for scrapping super-complaints from the Bill. Over the last six days in Committee, the noble Baroness has repeatedly argued that your Lordships must be wary about putting too much power in the hands of the Government or the tech sector, yet here we have a mechanism that allows users a route to justice that does not depend on their individual wealth. Only those with deep pockets and the skin of a rhinoceros can turn to the law as individuals. A super-complaints system allows interested parties, whether from the kids sector or the Free Speech Union, to act on behalf of a group. As I hope I have made clear, this is additional to, not instead of, an individual complaints system, and I very much hoped to have the noble Baroness’s support for both.

My Lords, I also put my name to Amendments 250A and 250B, but the noble Baronesses, Lady Newlove and Lady Kidron, have done such a good job that I shall be very brief.

To understand the position that I suspect the Government may put forward, I suggest one looks at Commons Hansard and the discussion of this in the seventh Committee sitting of 9 June last year. To read it is to descend into an Alice in Wonderland vortex of government contradictions. The then Digital Minister—a certain Chris Philp, who, having been so effective as Digital Minister, was promoted, poor fellow, to become a Minister in the Home Office; I do not know what he did to deserve that—in essence said that, on the one hand, this problem is absolutely vast, and we all recognise that. When responding to the various Members of the Committee who put forward the case for an independent appeals mechanism, he said that the reason we cannot have one is that the problem is too big. So we recognise that the problem is very big, but we cannot actually do anything about it, because it is too big.

I got really worried because he later did something that I would advise no Minister in the current Government ever to do in public. Basically, he said that this

“groundbreaking and world-leading legislation”—[Official Report, Commons, Online Safety Bill Committee, 9/6/22; col. 296.]

will fix this. If I ruled the world, if any Minister in the current Government said anything like that, they would immediately lose the Whip. The track record of people standing up and proudly boasting how wonderful everything is going to be, compared to the evidence of what actually happens, is not a track record of which to be particularly proud.

I witnessed, as I am sure others did, the experience of the noble Baroness, Lady Kidron, pulling together a group of bereaved parents: families who had lost their child through events brought about by the online world. A point that has stayed with me from that discussion was the noble Baroness, Lady Kidron, who was not complaining, saying at the end that there is something desperately wrong with the system where she ends up as the point person to try to help these people resolve their enormous difficulties with these huge companies. I remind noble Lords that the family of Molly Russell, aided by a very effective lawyer, took no less than five years to get Meta to actually come up with what she was looking at online. So the most effective complaints process, or ombudsman, was the fact they were able to have a very able lawyer and an exceptionally able advocate in the shape of the noble Baroness, Lady Kidron, helping in any way she could. That is completely inadequate.

I looked at one of the platforms that currently helps individual users—parents—trying to resolve some of the complaints they have with companies. It is incredibly complicated. So relying on the platforms themselves to bring forward, under the terms of the Bill, completely detailed systems and processes to ensure that these things do not happen, or that if there is a complaint it will be followed up dutifully and quickly, does not exactly fill me with confidence, based on their previous form.

For example, as a parent or an individual, here are some of the questions you might have to ask yourself. How do I report violence or domestic abuse online? How do I deal with eating disorder content on social media? How do I know what is graphic content that does not breach terms? How do I deal with abuse online? What do I do as a UK citizen if I live outside the UK? It is a hideously complex world out there. On the one hand, bringing in regulations to ensure that the platforms do what they are meant to, and on the other hand charging Ofcom to act as the policeman to make sure that they are actually doing it, is heaping yet more responsibility on Ofcom. The noble Lord, Lord Grade, is showing enormous stamina sitting up in the corner; he is sitting where the noble Lord, Lord Young, usually sits, which is a good way of giving the Committee a good impression.

What I would argue to the Minister is that charging Ofcom with doing too much leads us into dangerous territory. The benefit of having a proper ombudsman who deals with these sorts of complaints week in, week out, is exactly the same argument as for a hip or knee replacement: would you rather have it done by a surgical team that does the operation once a year or one that does it several hundred times a year? I do not know about noble Lords, but I would prefer the latter. An ombudsman service that dealt with these platforms day in, day out, would be best placed to identify whether or not those companies were actually doing what they are meant to do in law, because it would see at first hand how they responded. It could then liaise with Ofcom in real time to tell it if some platforms were not performing as they should. I feel that that would be more effective.

The only question I have for the Minister is whether he would please agree to meet with us between now and Report to really go into this in more detail, because this is an open goal which the Government really should be doing something to try to block. It is a bit of a no-brainer.

My Lords, I see my amendments as being probing. I am very keen on having a robust complaints system, including for individuals, and am open to the argument about an ombudsman. I am listening very carefully to the way that that has been framed. I tabled these amendments because while I know we need a robust complaints system—and I think that Ofcom might have a role in that—I would want that complaints system to be simple and as straightforward as possible. We certainly need somewhere that you can complain.

Ofcom will arguably be the most powerful regulator in the UK, effectively in charge of policing a key element of democracy: the online public square. Of course, one question one might ask is: how do you complain about Ofcom in the middle of it all? Ironically, an ombudsman might end up fielding more than just complaints about the tech companies.

I have suggested completely removing Clauses 150 to 152 from the Bill because of my reservations, beyond this Bill and in general, about a super-complaints system within the regulatory framework, which could be very unhelpful. I might be wrong, and I am open to correction if I have misunderstood, but the Bill's notion of an eligible entity that will be allowed to make such a complaint to Ofcom seems, at the moment, to depend on criteria set only by the Secretary of State. That is a serious problem. There is a danger that the Secretary of State could be accused of partiality or politicisation. We therefore have to think carefully about that.

I also object to the idea that certain organisations are anointed with extra legitimacy as super-complaints bodies. We have seen this more broadly. You will often hear Ministers say, in relation to consultations, “We’ve consulted stakeholders and civil society organisations”, when they are actually often referring to lobbying organisations with interests. There is a free-for-all for NGOs and interest groups. We think of a lot of charities as very positive but they are not necessarily neutral. I just wanted to query that.

There is also a danger that organisations will end up speaking on behalf of all women, all children or all Muslims. That is something we need to be careful about in a time of identity politics. We have seen it happen offline with self-appointed community leaders. Say, for example, a super-complainant demands the removal of a particular piece of content that is considered to be harmful, such as an image of the Prophet Muhammad. We have to admit that if people then claim, "We speak on behalf of", they will cause problems in areas such as these.

Although charities historically have had huge credibility, as I said, we know from some of the scandals that have affected charities recently that they are not always the saviours. They are certainly not immune from corruption, political bias, political disputes and so on.

I suppose my biggest concern is that the function in the Bill is not open to all members of the public. That seems to be a problem. Therefore, we are saying that certain groups and individuals will have a greater degree of influence over the permissibility of speech than others. There are some people who have understood these clauses to mean that it would act like a class action—that if enough people are complaining, it must be a problem. But, as noble Lords will know from their inboxes, sometimes one is inundated with emails and it does not necessarily show a righteous cause. I do not know about anyone else who has been involved in this Bill, but I have had exactly the same cut-and-paste email about violence against women and girls hundreds of times. That usually means a well-organised, sometimes well-funded, mass mobilisation. I have no objection, but just because you get lots of emails it does not mean that it is a good complaint. If you get only one important email complaint that is written by an individual, surely you should respect that minority view.

Is it not interesting that the assumption of speakers so far has been that the complaints will always be that harms have not been removed or taken notice of? I was grateful when the noble Baroness, Lady Kidron, mentioned the Free Speech Union and recognised, as I envisage, that many of the complaints will be about content having been removed—they will be free speech complaints. Often, in that instance, it will be an individual whose content has been removed. I cannot see how the phrasing of the Bill helps us in that. Although I am a great supporter of the Free Speech Union, I do not want it to represent or act on behalf of, say, Index on Censorship or even an individual who simply thinks that their content should not be removed—and who is no less valid than an official organisation, however much I admire it.

I certainly do not want individual voices to be marginalised, which I fear the Bill presently does in relation to complaints. I am not sure about an ombudsman; I am always wary of creating yet another more powerful body in the land because of the danger of over-bureaucratisation.

My Lords, I had to miss a few sessions of the Committee but I am now back until the end. I remind fellow Members of my interests: I worked for one of the largest platforms for a decade, but I have no current interests. It is all in the register if people care to look. I want to contribute to this debate on the basis of that experience of having worked inside the platforms.

I start by agreeing with the noble Baroness, Lady Kidron, the noble Lord, Lord Stevenson, and my noble friend Lord Clement-Jones. The thrust of their amendments—the idea that something will be needed here—is entirely correct. We have created in the Online Safety Bill a mechanism that we in this Committee know is intended primarily to focus on systems and how Ofcom regulates them, but what the public out there hear is that we are creating a mechanism that will meet their concerns—and their concerns will not end with systems. As the noble Baroness, Lady Newlove, eloquently described, their concerns in some instances will be about specific cases and the question will be: who will take those up?

If there is no other mechanism and no way to signpost people to a place where they can seek redress, they will come to Ofcom. That is something we do not want. We want Ofcom to be focused on the big-ticket items of dealing with systems, not bogged down in dealing with thousands of individual complaints. So we can anticipate a situation in which we will need someone to be able to deal with those individual complaints.

I want to focus on making that workable, because the volume challenge might not be as people expect. I have seen from having worked on the inside that there is a vast funnel of reports, where people report content to platforms. Most of those reports are spurious or vexatious; that is the reality. Platforms have made their reporting systems easy, as we want them to do —indeed, in the Bill we say, “Make sure you have really easy-to-use reporting systems”—but one feature of that is that people will use them simply to express a view. Over the last couple of weeks, all the platforms will have been inundated with literally millions of reports about Turkish politicians. These will come from the supporters of either side, reporting people on the other side—claiming that they are engaged in hate speech or pornography or whatever. They will use whatever tool they can. That is what we used to see day in, day out: football teams or political groups that report each other. The challenge is to separate out the signal—the genuinely serious reports of where something is going wrong—from the vast amount of noise, of people simply using the reporting system because they can. For the ombudsman, the challenge will be that signal question.

Breaking that down, from the vast funnel of complaints coming in, we have a smaller subset that are actionable. Some of those will be substantive, real complaints, where the individual simply disagrees with the decision. That could be primarily for two reasons. The first is that the platform has made a bad decision and failed to enforce its own policies. For example, you reported something as being pornographic, and it obviously was, but the operator was having a bad day—they were tired, it was late in the day and they pressed “Leave up” instead of “Take down”. That happens on a regular basis, and 1% of errors like that across a huge volume means a lot of mistakes being made. Those kinds of issues, where there is a simple operator error, should get picked up by the platforms’ own appeal mechanisms. That is what they are there for, and the Bill rightly points to that. A second reviewer should look at it. Hopefully they are a bit fresher, understand that a mistake was made and can simply reverse it. Those operator error reports can be dealt with internally.

The second type would be where the platform enforces policies correctly but, from the complainant’s point of view, the policies are wrong. It may be a more pro-free speech platform where the person says, “This is hate speech”, but the platform says, “Well, according to our rules, it is not. Under our terms of service, we permit robust speech of this kind. Another platform might not, but we do”. In that case, the complainant is still unhappy but the platform has done nothing wrong—unless the policies the platform is enforcing are out of step with the requirements under the Online Safety Bill, in which case the complaint should properly come to Ofcom. Based on the individual complaint, a complainant may have something material for Ofcom. They are saying that they believe the platform’s policies and systems are not in line with the guidance issued by Ofcom—whether on hate speech, pornography or anything else. That second category of complaint would come to Ofcom.

The third class concerns the kind of complaint that the noble Baroness, Lady Newlove, described. In some ways, this is the hardest. The platform has correctly enforced its policies but, in a particular case, the effect is deeply unfair, problematic and harmful for an individual. The platform simply says, “Look, we enforced the policies. They are there. This piece of content did not violate them”. Any outsider looking at it would say, “There is an injustice here. We can clearly see that an individual is being harmed. A similar piece of content might not be harmful to another individual, but to this individual it is”. In those circumstances, groups such as the South West Grid for Learning, with which I work frequently, perform an invaluable task. We should recognise that there is a network of non-governmental organisations in the United Kingdom that do this day in, day out. Groups such as the Internet Watch Foundation and many others have fantastic relations and connections with the platforms and regularly bring exceptional cases to them.

We are glad to have the noble Lord back. I want also to put on the record that the South West Grid for Learning is very supportive of this amendment.

It has let me know as well. In a way, the amendment seeks to formalise what is already an informal mechanism. I was minded initially to support Amendment 56 in the name of my noble friend Lord Clement-Jones and the noble Lord, Lord Stevenson.

This landscape is quite varied. We have to create some kind of outlet, as the noble Baroness, Lady Kidron, rightly said. That parent or individual will want to go somewhere, so we have to send them somewhere. We want that somewhere to be effective, not to get bogged down in spurious and vexatious complaints. We want it to have a high signal-to-noise ratio—to pull out the important complaints and get them to the platforms. That will vary from platform to platform. In some ways, we want to empower Ofcom to look at what is and is not working and to be able to say, “Platform A has built up an incredible set of mechanisms. It’s doing a good job. We’re not seeing things falling through the cracks in the same way as we are seeing with platform B. We are going to have to be more directive with platform B”. That very much depends on the information coming in and on how well the platforms are doing their job already.

I hope that the Government are thinking about how these individual complaints will be dealt with and about the demand that will be created by the Bill. How can we have effective mechanisms for people in the United Kingdom who genuinely have hard cases and have tried, but where there is no intermediary for the platform they are worried about? In many cases, I suspect that these will be newer or smaller platforms that have arrived on the scene and do not have established relationships. Where are these people to go? Who will help them, particularly in cases where the platform may not systemically be doing anything wrong? Its policies are correct and it is enforcing them correctly, but any jury of peers would say that an injustice is being done. Either an exception needs to be made or there needs to be a second look at that specific case. We are not asking Ofcom to do this in the rest of the legislation.

My Lords, it is always somewhat intimidating to follow the noble Lord, Lord Allan, though it is wonderful to have him back from his travels. I too will speak in favour of Amendments 250A and 250B in the name of my noble friend, from not direct experience in the social media world but tangentially, from telecoms regulation.

I have lived, as the chief executive of a business, in a world where my customers could complain to me but also to an ombudsman and to Ofcom. I say this with some hesitation, as my dear old friends at TalkTalk will be horrified to hear me quoting this example, but 13 years ago, when I took over as chief executive, TalkTalk accounted for more complaints to Ofcom than pretty much all the other telcos put together. We were not trying to be bad—quite the opposite, actually. We were a business born out of very rapid growth, both organic and acquisitive, and we did not have control of our business at the time. We had an internal complaints process and were trying our hardest to listen to it and to individual customers who were telling us that we were letting them down, but we were not doing that very well.

While my noble friend has spoken so eloquently about the importance of complaints mechanisms for individual citizens, I am actually in favour of them for companies. I felt the consequences of having an independent complaints system that made my business listen. It was a genuine failsafe system: for someone to have got as far as complaining to the telecoms ombudsman and to Ofcom, they had really lost the will to live in dealing with my own business. That forced my company to change. It has forced telecoms companies to change so much that they now advertise where they stand in the rankings of complaints per thousand customers. Even in the course of the last week, Sky was proclaiming in its print advertising that it was the provider least complained about to the independent complaints mechanism.

So this is not about thinking that companies are bad and are trying to let their customers down. As the noble Lord, Lord Allan, has described, managing these processes is really hard and you really need the third line of defence of an independent complaints mechanism to help you deliver on your best intentions. I think most companies with very large customer bases are trying to meet those customers’ needs.

For very practical reasons, I have experienced the power of these sorts of systems. There is one difference with the example I have given of telecoms: it was Ofcom itself that received most of those complaints about TalkTalk 13 years ago, and I have tremendous sympathy with the idea that we might unleash on poor Ofcom all the social media complaints that are not currently being resolved by the companies. That is exactly why, as Dame Maria Miller said, we need to set up an independent ombudsman to deal with this issue.

From a very different perspective from that of my noble friend, I struggle to understand why the Government do not want to do what they have just announced they want to do in other sectors such as gambling.

My Lords, I had better start by declaring an interest. It is a great pleasure to follow the noble Baroness, Lady Harding, because my interest is directly related to the ombudsman she has just been praising. I am chairman of the board of the Trust Alliance Group, which runs the Energy Ombudsman and the telecoms ombudsman. The former was set up under the Consumers, Estate Agents and Redress Act 2007 and the latter under the Communications Act 2003.

Having got that off my chest, I do not have to boast about the efficacy of ombudsmen; they are an important institution, they take the load off the regulator to a considerable degree and they work closely with the participating companies in the schemes they run. On balance, I would prefer the Consumers, Estate Agents and Redress Act scheme because it involves a single ombudsman, but both those ombudsmen demonstrate the benefit in their sectors.

The noble Lord, Lord Stevenson, pretty much expressed the surprise that we felt when we read the Government’s response to what we thought was a pretty sensible suggestion in the Joint Committee’s report. He quoted it, and I am going to quote it again because it is such an extraordinary statement:

“An independent resolution mechanism such as an Ombudsman is relatively untested in areas of non-financial harm”.

If you look at the ones for which I happen to have some responsibility, and at the other ombudsmen—there is a whole list we could go through: the Legal Ombudsman, the Local Government and Social Care Ombudsman, the Parliamentary and Health Service Ombudsman—there are a number who are absolutely able to take a view on non-financial matters. It is a bit flabbergasting, if that is a parliamentary expression, to come across that kind of statement in a government response.

There has been distilled wisdom during the course of this debate. Although there may be differences of view about whether we have half a loaf or a full loaf, what is clear is that we are all trying to head in the same direction, which is to have an ombudsman for complaints in this sector. We need to keep reminding everybody that this is not a direct complaints system: it is a secondary complaints system, and you have to have exhausted the complaints within the social media platform. My noble friend described some of the complexity of that extremely well, and I thank the noble Baroness, Lady Kidron, for setting out some of the complexities and the views of the expert group.

I mentioned the Government’s response to the Joint Committee, but as the noble Baroness, Lady Newlove, said, we already have an independent appeals system in the video-sharing platform legislation. Why are we going backwards in this Bill? This Bill is set to dismantle an essential obligation that supports victims of online harm on video-sharing platforms; we should be more comprehensive, not less. The South West Grid for Learning’s independent appeals process, Report Harmful Content, which has been mentioned already today, highlights that a significant number of the responses victims of harmful content initially received from industry platforms were incorrect, and RHC was able to resolve them.

Some of these misunderstandings are not necessarily complaints that need adjudication; sometimes it is actually miscommunication. We have heard during the debate that other countries are already doing this; several noble Lords have mentioned Ireland, Australia and New Zealand. We need that ability too, and this is where I disagree with the noble Baroness, Lady Fox, although there was a nuance in her argument. It was a probing amendment—how about that? The fact that representative organisations can defend users’ rights in large-scale breaches of the law is very important, and I was rather surprised by her criticism of the fact that lobby groups can bring action. Orchestrating complaints is the nature of group or class actions in litigation; such complaints need to be tested, and it is perfectly legitimate for a group to bring them.

The noble Baroness, Lady Newlove, mentioned the research published by the Children’s Commissioner which showed that 40% of children did not report harmful content because they felt there was no point in doing so. That is pretty damning of the current situation. The noble Lord, Lord Russell, and my noble friend made the very strong point that we do not want to bog down the regulator. We see what happens under the data protection legislation. The ICO has an enormous number of complaints to deal with directly, without the benefit of an ombudsman. This scheme could alleviate the burden on the regulator and be highly effective. I do not think we have heard an argument in Committee against this; it must be the way forward. I very much hope that the Minister will take this forward after today and install an ombudsman for the Bill.

My Lords, the amendments in this group are concerned with complaints mechanisms. I turn first to Amendment 56 from the noble Lord, Lord Stevenson of Balmacara, which proposes introducing a requirement on Ofcom to produce an annual review of the effectiveness and efficiency of platforms’ complaints procedures. Were this review to find that regulated services were not complying effectively with their complaints procedure duties, the proposed new clause would provide for Ofcom to establish an ombudsman to provide a dispute resolution service in relation to complaints.

While I am of course sympathetic to the aims of this amendment, the Government remain confident that service providers are best placed to respond to individual user complaints, as they will be able to take appropriate action promptly. This could include removing content, sanctioning offending users, reversing wrongful content removal or changing their systems and processes. Accordingly, the Bill imposes a duty on regulated user-to-user and search services to establish and operate an easy-to-use, accessible and transparent complaints procedure. The complaints procedure must provide for appropriate action to be taken by the provider in relation to the complaint.

It is worth reminding ourselves that this duty is an enforceable requirement. Where a provider is failing to comply with its complaints procedure duties, Ofcom will be able to take enforcement action against the regulated service. Ofcom has a range of enforcement powers, including the power to impose significant penalties and confirmation decisions that can require the provider to take such steps as are required for compliance. In addition, the Bill includes strong super-complaints provisions that will allow for concerns about systemic issues to be raised with the regulator, which will be required to publish its response to the complaint. This process will help to ensure that Ofcom is made aware of issues that users are facing.

Separately, individuals will also be able to submit complaints to Ofcom. Given the likelihood of an overwhelming volume of complaints, as we have heard, Ofcom will not be able to investigate or arbitrate on individual cases. However, those complaints will be an essential part of Ofcom’s horizon-scanning, research, supervision and enforcement activity. They will guide Ofcom in deciding where to focus its attention. Ofcom will also have a statutory duty to conduct consumer research about users’ experiences in relation to regulated services and the handling of complaints made by users to providers of those services. Further, Ofcom can require that category 1, 2A and 2B providers set out in their annual transparency reports the measures taken to comply with their duties in relation to complaints. This will further ensure that Ofcom is aware of any issues facing users in relation to complaints processes.

At the same time, I share the desire expressed to ensure that the complaints mechanisms will be reviewed and assessed. That is why the Bill contains provisions for the Secretary of State to undertake a review of the efficacy of the entire regulatory framework. This will take place between two and five years after the Part 3 provisions come into force, which is a more appropriate interval for the efficacy of the duties around complaints procedures to be reviewed, as it will allow time for the regime to bed in and provide a sufficient evidence base to assess whether changes are needed.

Finally, I note that Amendment 56 assumes that the preferred solution following a review will be an ombudsman. There is probably not enough evidence to suggest that an ombudsman service would be effective for the online safety regime. It is unclear how an ombudsman service would function in support of the new online safety regime, because individual user complaints are likely to be complex and time-sensitive—and indeed, in many cases financial compensation would not be appropriate. So I fear that the noble Lord’s proposed new clause pre-empts the findings of a review with a solution that is resource-intensive and may be unsuitable for this sector.

Amendments 250A and 250B, tabled by my noble friend Lady Newlove, require that an independent appeals system is established and that Ofcom produces guidance to support this system. As I have set out, the Government believe that decisions on user redress and complaints are best dealt with by services. Regulated services will be required to operate an easy-to-use, accessible and transparent complaints procedure that enables users to make complaints. If services do not comply with these duties, Ofcom will be able to utilise its extensive enforcement powers to bring them into compliance.

The Government are not opposed to revisiting the approach to complaints once the regime is up and running. Indeed, the Bill provides for the review of the regulatory framework. However, it is important that the new approach, which will radically change the regulatory landscape by proactively requiring services to have effective systems and processes for complaints, has time to bed in before it is reassessed.

Turning specifically to the points made by my noble friend and by the noble Baroness, Lady Kidron, about the impartial out of court dispute resolution procedure in the VSP, the VSP regime and the Online Safety Bill are not directly comparable. The underlying principles of both regimes are of course the same, with the focus on systems regulation and protections for users, especially children. The key differences are regarding the online safety framework’s increased scope. The Bill covers a wider range of harms and introduces online safety duties on a wider range of platforms. Under the online safety regime, Ofcom will also have a more extensive suite of enforcement powers than under the UK’s VSP regime.

On user redress, the Bill goes further than the VSP regime as it will require services to offer an extensive and effective complaints process and will enable Ofcom to take stronger enforcement action where they fail to meet this requirement. That is why the Government have put the onus of the complaints procedure on the provider and set out a more robust approach which requires all in-scope, regulated user-to-user and search services to offer an effective complaints process that provides for appropriate action to be taken in relation to the complaint. This will be an enforceable duty and will enable Ofcom to utilise its extensive online safety enforcement powers where services are not complying with their statutory duty to provide a usable, accessible and transparent complaints procedure.

At the same time, we want to ensure that the regime can develop and respond to new challenges. That is why we have included a power for the Secretary of State to review the regulatory framework once it is up and running. This will provide the correct mechanism to assess whether complaint handling mechanisms can be further strengthened once the new regulations have had time to bed in.

For these reasons, the Government are confident that the Online Safety Bill represents a significant step forward in keeping users safe online.

My Lords, could I just ask a question? This Bill has been in gestation for about five to six years, during which time the scale of the problems we are talking about has increased exponentially. The Government appear to be suggesting that they will, in three to five years, evaluate whether or not their approach is working effectively.

There was a lot of discussion in this Chamber yesterday about the will of the people and whether the Government were ignoring it. I gently suggest that the very large number of people, who are having all sorts of problems or who are fearful of harm from the online world, will not find in the timescale that the Government are proposing the sort of remedy and speed of action I suspect they were hoping for. Certainly, the rhetoric the Government have used and continue to use at regular points in the Bill when they are slightly on the back foot seems to be designed to try to make the situation seem better than it is.

Will the Minister and the Bill team take on board that there are some very serious concerns that there will be a lot of lashing back at His Majesty’s Government if in three years’ time—which I fear may be the case—we still have a situation where a large body of complaints are not being dealt with? Ofcom is going to suffer from major ombudsman-like constipation trying to deal with this, and the harms will continue. I think I speak for the Committee when I say that the arguments the Minister and the government side are making really do not hold water.

I thought in particular of the direct experience of the noble Baroness, Lady Harding, demonstrating the effect on her company—so substitute platforms for that—of knowing that you are being held to account. Having a system that helps the regulator understand in real time whether or not these companies are doing what they should—they are an early warning system and would know earlier than Ofcom would—just seems sensible. But perhaps being sensible is not what this Bill is about.

I do not know about that last point. I was going to say that I am very happy to meet the noble Lord to discuss it. It seems to me to come down to a matter of timing and the timing of the first review. As I say, I am delighted to meet the noble Lord. By the way, the relevant shortest period is two years not three, as he said.

Following on from my friend, the noble Lord, Lord Russell, can I just say to the Minister that I would really welcome all of us having a meeting? As I am listening to this, I am thinking that three to five years is just horrific for the families. This Bill has gone on for so long to get where we are today. We are losing sight of humanity here, and of the moral compass of protecting human lives. For whichever Government is in place in three to five years to decide that it does not work would be absolutely shameful. Nobody in the Government will be accountable, and yet for that family, that single person may commit suicide. We have met the bereaved families, so I say to the Minister that we need to go round the table and look at this again. I do not think it is acceptable to say that there is this timeline, this review, for the Secretary of State when we are dealing with young lives. It is in the public interest to get this Bill correct as it navigates its way back to the House of Commons in a far better state than how it arrived.

I would love the noble Viscount to answer my very specific question about who the Government think families should turn to when they have exhausted the complaints system in the next three to five years. I say that as someone who has witnessed successive Secretaries of State promising families that this Bill would sort this out. Yes?

It is between two and five years. It can be two; it can be five. I am very happy to meet my noble friend and to carry on doing so. The complaints procedure set up for families is first to approach the service provider, in an enforceable manner, and, should the provider fail to meet its enforceable duties, then to revert to Ofcom before the courts.

I am sorry but that is exactly the issue at stake. The understanding of the Committee currently is that there is then nowhere to go if they have exhausted that process. I believe that complainants are not entitled to go to Ofcom in the way that the noble Viscount just suggested.

Considerably more rights are provided than they have today, with the service provider. Indeed, Ofcom would not necessarily deal with individual complaints—

What recourse would they have, if Ofcom will not deal with individual complaints in those circumstances?

I am happy to meet and discuss this. We are expanding what they are able to receive today under the existing arrangements. I am happy to meet any noble Lords who wish to take this forward to help them understand this—that is probably best.

Amendments 287 and 289 from the noble Baroness, Lady Fox of Buckley, seek to remove the provision for super-complaints from the Bill. The super-complaints mechanism is an important part of the Bill’s overall redress mechanisms. It will enable entities to raise concerns with Ofcom about systemic issues in relation to regulated services, which Ofcom will be required to respond to. This includes concerns about the features of services or the conduct of providers creating a risk of significant harm to users or the public, as well as concerns about significant adverse impacts on the right to freedom of expression.

On who can make super-complaints, any organisation that meets the eligibility criteria set out in secondary legislation will be able to submit a super-complaint to Ofcom. Organisations will be required to submit evidence to Ofcom, setting out how they meet these criteria. Using this evidence, Ofcom will assess organisations against the criteria to ensure that they meet them. The assessment of evidence will be fair and objective, and the criteria will be intentionally strict to ensure that super-complaints focus on systemic issues and that the regulator is not overwhelmed by the number it receives.

To clarify and link up the two parts of this discussion, can the Minister perhaps reflect, when the meeting is being organised, on the fact that the organisations and the basis on which they can complain will be decided by secondary legislation? So we do not know which organisations or what the remit is, and we cannot assess how effective that will be. We know that the super-complainants will not want to overwhelm Ofcom, so things will be bundled into that. Individuals could be excluded from the super-complaints system in the way that I indicated, because super-complaints will not represent everyone, or even minority views; in other words, there is a gap here now. I want that bit gone, but that does not mean that we do not need a robust complaints system. Before Report at least—in the meetings in between—the Government need to advise on how you complain if something goes wrong. At the moment, the British public have no way to complain at all, unless someone sneaks it through in secondary legislation. This is not helpful.

Again, I am just pulling this together—I am curious to understand this. We have been given a specific case—South West Grid for Learning raising a case based on an individual but that had more generic concerns—so could the noble Viscount clarify, now or in writing, whether that is the kind of thing that he imagines would constitute a super-complaint? If South West Grid for Learning went to a platform with a complaint like that—one based on an individual but brought by an organisation—would Ofcom find that complaint admissible under its super-complaints procedure, as imagined in the Bill?

Overall, the super-complaints mechanism is more for groupings of complaints and has a broader range than the individual complaints process, but I will consider that point going forward.

Many UK regulators have successful super-complaints mechanisms which allow them to identify and target emerging issues and effectively utilise resources. Alongside the Bill’s research functions, super-complaints will perform a vital role in ensuring that Ofcom is aware of the issues users are facing, helping them to target resources and to take action against systemic failings.

On the steps required after super-complaints, the regulator will be required to respond publicly to the super-complaint. Issues raised in the super-complaint may lead Ofcom to take steps to mitigate the issues raised in the complaint, where the issues raised can be addressed via the Bill’s duties and powers. In this way, they perform a vital role in Ofcom’s horizon-scanning powers, ensuring that it is aware of issues as they emerge. However, super-complaints are not linked to any specific enforcement process.

My Lords, it has just occurred to me what the answer is to the question, “Where does an individual actually get redress?” The only way they can get redress is by collaborating with another 100 people and raising a super-complaint. Is that the answer under the Bill?

No. The super-complaints mechanism is better thought of as part of a horizon-scanning mechanism. It is not—

The answer to the noble Lord’s question is that the super-complaint is not a mechanism for individuals to complain on an individual basis and seek redress.

This is getting worse and worse. I am tempted to suggest that we stop talking about this and try to, in a smaller group, bottom out what we are doing. I really think that the Committee deserves a better response on super-complaints than it has just heard.

As I understood it—I am sure that the noble Baroness, Lady Kidron, is about to make the same point—super-complaints are specifically designed to take away the pressure on vulnerable and younger persons to have responsibility only for themselves in bringing forward the complaint that needs to be resolved. They are a way of sharing that responsibility and taking away the pressure. Is the Minister now saying that that is a misunderstanding?

I understand that the Minister has been given a sticky wicket of defending the indefensible. I welcome a meeting, as I think the whole Committee does, but it would be very helpful to hear the Government say that they have chosen to give individuals no recourse under the Bill—that this is the current situation, as it stands, and that there is no concession on the matter. I have been in meetings with people who have been promised such things, so it is really important, from now on in Committee, that we actually state at the Dispatch Box what the situation is. I spent quite a lot of the weekend reading circular arguments, and we now need to get to an understanding of what the situation is. We can then decide, as a Committee, what we do in relation to that.

As I said, I am very happy to hold the meeting. We are giving users greater protection through the Bill, and, as agreed, we can discuss individual routes to recourse.

I hope that, on the basis of what I have said and the future meeting, noble Lords have some reassurance that the Bill’s complaint mechanisms will, eventually, be effective and proportionate, and feel able not to press their amendments.

I am very sorry that I did not realise that the Minister was responding to this group of amendments; I should have welcomed him to his first appearance in Committee. I hope he will come back—although he may have to spend a bit of time in hospital, having received a pass to speak on this issue from his noble friend.

This is a very complicated Bill. The Minister and I have actually talked about that over tea, and he is now learning the hard lessons of what he took as a light badinage before coming to the Chamber today. However, we are in a bit of a mess here. I was genuinely trying to get an amendment that would encourage the department to move forward on this issue, because it is quite clear from the mood around the Committee that something needs to be resolved here. The way the Government are approaching this is by heading towards a brick wall, and I do not think it is the right way forward.

The Minister cannot ignore the evidence, from two very well-respected practitioners who have been involved in this sort of process and understand how it works, that this is not the way forward. He has heard somebody who works professionally in this area explain how the system works in practice. He is hearing from individuals who, as we have now discovered, otherwise have nowhere to go. We are being told what seems to be a very confusing story about what the super-complaints system is about and how it will be done. This must be sorted, otherwise he will find that the Great British public, for whom the Bill is designed, particularly younger people, will turn around and say, “This is what you promised us?” They will not believe it and they will not like it.

All I heard coming through from the debate is that Ofcom will pick up a lot of complaints and use them to inform itself about what should happen five years down the track, the next time that the regulatory review takes place. That is not what we are about here. This is about filling a gap in a system for which promises have been issued over the seven long years that we have been waiting for the Bill. People out there expect the Bill to make their lives much more reasonable and to be respectful of their rights and responsibilities.

We find that the VSP provisions are being deleted—a system we already have, which at least does first approximation work. We find that we are reinforcing the inequality of arms between individuals and companies. We find that DCMS—it is not the same department because the Minister is now in DSIT, but it is his former sister department—is creating an ombudsman for gambling problems, having identified that they have gone too far, too fast, and are now out of control and need to be responded to. This just does not add up. The Government are in a mess. Please sort it. I beg leave to withdraw the amendment.

Amendment 56 withdrawn.

Clause 18: Duties about freedom of expression and privacy

Amendment 57 not moved.

Amendment 58

Moved by

58: Clause 18, page 20, line 32, at end insert “as defined under the Human Rights Act 1998 and its application to the United Kingdom.”

My Lords, I am delighted to propose this group of amendments on devolution issues. I am always delighted to see the Committee so full to talk about devolution issues. I will speak particularly to Amendments 58, 136, 225A and 228 in this group, all in my name. I am very grateful to the noble Lord, Lord Foulkes of Cumnock, for supporting them.

As I have said before in Committee, I have looked at the entire Bill from the perspective of a devolved nation, in particular at the discrepancies and overlaps of Scots law, UK law and ECHR jurisprudence that I was concerned had not been taken into account or addressed by the Bill as it stands. Many have said that they are not lawyers; I am also not. I am therefore very grateful to the Law Society of Scotland, members of Ofcom’s Advisory Committee for Scotland, and other organisations such as the Carnegie Trust and Legal to Say, Legal to Type, which have helped formulate my thinking. I also thank the Minister and the Bill team for their willingness to discuss these issues in advance with me.

When the first proposed Marshalled List for this Committee was sent round, my amendments were dotted all over the place. When I explained to the Whips that they were all connected to devolved issues and asked that they be grouped together, that must have prompted the Bill team to go and look again; the next thing I know, there is a whole raft of government amendments in this group referring to Wales, Northern Ireland, the Bailiwick of Guernsey and the Isle of Man—though not Scotland, I noted. These government amendments are very welcome; if nothing else, I am grateful to have achieved that second look from the devolved perspective.

In the previous group, we heard how long the Bill had been in gestation. I have the impression that, because online safety decision-making is a centralised and reserved matter, the regions are overlooked and engaged only at a late stage. The original internet safety Green Paper made no reference to Scotland at all; it included a section on education describing only the English education system and an annexe of legislation that did not include Scottish legislation. Thankfully, this oversight was recognised by the White Paper, two years later, which included a section on territorial scope. Following this, the draft Bill included a need for platforms to recognise the differences in legislation across the UK, but this was subsequently dropped.

I remain concerned that the particular unintended consequences of the Bill for the devolved Administrations have not been fully appreciated or explored. While online safety is a reserved issue, many of the matters that it deals with—such as justice, the police or education—are devolved, and, as many in this House appreciate, Scots law is different.

At the moment, the Bill is relatively quiet on how freedom of expression is defined; how it applies to the providers of user-to-user services and their duties to protect users’ rights to freedom of expression; and how platforms balance those competing rights when adjudicating on content removal. My Amendment 58 has similarities to Amendment 63 in the name of the noble and learned Lord, Lord Hope of Craighead. It seeks to ensure that phrases such as “freedom of expression” are understood in the same way across the United Kingdom. As the noble and learned Lord pointed out when speaking to his Amendment 63 in a previous group, words matter, and I will therefore be careful to refer to “freedom of expression” rather than “freedom of speech” throughout my remarks.

Amendment 58 asks the Government to state explicitly which standards of speech platforms apply in each of the jurisdictions of the UK, because at this moment there is a difference. I accept that the Human Rights Act is a UK statute already, but, under Article 10—as we have heard—freedom of expression is not an absolute right and may be subject to such formalities, conditions, restrictions or penalties as are prescribed by law.

The noble Lord, Lord Moylan, argued last week that the balance between freedom of expression and any condition or restriction was not an equal one but was weighted in favour of freedom of expression. I take this opportunity to take some issue with my noble friend, who is not in his place, on this. According to the Equality and Human Rights Commission, the British Institute of Human Rights and Supreme Court judgments, human rights are equal and indivisible; neither has automatic priority, and how they are balanced depends on the context and the particular facts.

In Scotland, the Scottish Government believe that they are protecting freedom of expression, but the Hate Crime and Public Order (Scotland) Act 2021 criminalises speech that is not illegal elsewhere in the UK. Examples from the Scottish Government’s own information note state that it is now an offence in Scotland

“if the urging of people to cease practising their religion is done in a threatening or abusive manner or, alternatively, … if a person were to urge people not to engage in same-sex sexual activity while making abusive comments about people who identify as lesbian, gay or bisexual”.

The Lord Advocate’s guidance to the police says that

“an incident must be investigated as a hate crime if it is perceived, by the victim or any other person, to be aggravated by prejudice”.

I stress that I make absolutely no comment about the merits, or otherwise, of the Hate Crime and Public Order (Scotland) Act. I accept that it is yet to be commenced. However, commencement is in the hands of the Scottish Parliament, not the Minister and his team, and I highlight it here as an illustration of the divergence of interpretation that is happening between the devolved nations now, and as an example of what could happen in the future.

So, I would have thought that we would want to take a belt-and-braces approach to ensuring that there cannot be any differences in interpretation of what we mean by freedom of expression, and I hope that the Minister will accept my amendment for the sake of clarity. Ofcom is looking for clarity wherever possible, and clarity will be essential for platforms. Amendment 58 would allow platforms to interpret freedom of expression as a legal principle, rather than having to adapt considerations for Scotland, and it would also help prevent Scottish users’ content being censored more than that of English users, as platforms could rely on a legally certain basis for decision-making.

The hate crime Act was also the motivation for my Amendment 136, which asks why the Government did not include it on the list of priority offences in Schedule 7. I understand that the Scottish Government did not ask for it to be included, but since when did His Majesty’s Government do what the Scottish Government ask of them?

I have assumed that the Scottish Government did not ask for it because the hate crime Act is yet to be commenced in Scotland and there are, I suspect, multiple issues to be worked out with Police Scotland and others before it can be. I stress again that it is not my intention that the Hate Crime and Public Order (Scotland) Act should dictate the threshold for illegal and priority illegal content in this Bill—Amendment 136 is a probing amendment—but the omission of the hate crime Act does raise the question of a devolution deficit because, while the definition of “illegal content” varies, people in areas of the UK with more sensitive thresholds would have to rely on the police to enforce some national laws online rather than benefiting from the additional protections of the Ofcom regime.

Clause 53(5)(c) of this Bill states that

“the offence is created by this Act or, before or after this Act is passed, by”—

this is in sub-paragraph (iv)—

“devolved subordinate legislation made by a devolved authority with the consent of the Secretary of State or other Minister of the Crown”.

How would this consent be granted? How would it involve this Parliament? What consultation should be required, and with whom—particularly since the devolved offence might change the thresholds for the offence across the whole of the UK? The phrase “consent of the Secretary of State” implies that a devolved authority would apply to seek consent. Should not this application process be set out in the Bill? What should the consultation process with devolved authorities and Ofcom be if the Secretary of State wishes to initiate the inclusion of devolved subordinate legislation? Do we not need a formal framework for parliamentary scrutiny—an equivalent of the Grimstone process, perhaps? I would be very happy to work with the Minister and his team on a Parkinson process between now and Report.

Amendments 225A and 228 seek to ensure that there is an analysis of users’ online experiences in the different nations of the UK. Amendment 225A would require Ofcom to ensure that its research into online experiences was analysed in a nation-specific way while Amendment 228 would require Ofcom’s transparency reporting to be reported via each nation. The fact is that, at this moment in time, we do not know whether there is a difference in the online experience across the four nations. For example, are rural or remote communities at greater risk of online harm because they have a greater dependence on online services? How would online platforms respond to harmful sectarian content? What role do communication technologies play in relation to offline violence, such as knife crime?

We can compare other data by nation, for example on drug use or gambling addiction. Research and transparency reporting are key to understanding nation-specific harms online, but I fear that Ofcom will have limited powers in this area if they are not specified in the Bill. Ofcom has good working relationships from the centre with the regions, and part of this stems from the fact that legislation in other sectors, such as broadcasting, requires it to have advisory committees in each of the nations to ensure that English, Scottish, Northern Irish and Welsh matters are considered properly. Notably, those measures do not exist in this Bill.

The interplay between the high-level and reserved nature of internet services and online safety will require Ofcom to develop a range of new, wider partnerships in Scotland—for example with Police Scotland—and to collaborate closely at a working level with a wide range of interests within the Scottish Government, where such interests will be split across a range of ministerial portfolios. In other areas of its regulatory responsibility, Ofcom’s research publications provide a breakdown of data by nation. Given the legislative differences that already exist between the four nations, it is an omission that such a breakdown is not explicitly required in the Bill.

I have not touched—and I am not going to touch—on how this Bill might affect other devolved Administrations. The noble Baroness, Lady Foster of Aghadrumsee, apologises for being unable to be in the Chamber to lend her voice from a Northern Ireland perspective—I understand from her that the Justice (Sexual Offences and Trafficking Victims) Act (Northern Ireland) 2022 might be another example of this issue—but she has indicated her support here. As my noble friend Lady Morgan of Cotes said last Thursday:

“The Minister has done a very good job”

of

“batting away amendments”.—[Official Report, 11/5/23; col. 2043.]

However, I am in an optimistic mood this afternoon, because the Minister responded quite positively to the request from the noble and learned Lord, Lord Hope, that we should define “freedom of expression”. There is great benefit to be had from ensuring that this transparency of reporting and research can be broken down by nation. I am hopeful, therefore, that the Minister will take the points that I have raised through these amendments and that he will, as my noble friend Lady Morgan of Cotes hoped, respond by saying that he sees my points and will work with me to ensure that this legislation works as we all wish it to across the whole of the UK. I beg to move.

My Lords, I warmly support the amendment moved by the noble Baroness, Lady Fraser of Craigmaddie, to which I have added my name. I agree with every word she said in her introduction. I could not have said it better and I have nothing to add.

My Lords, I follow the noble Lord, Lord Foulkes, with just a few words. As we have been reminded, I tabled Amendment 63, which has already been debated. The Minister will remember that my point was about legal certainty; I was not concerned with devolution, although I mentioned Amendment 58 just to remind him that we are dealing with all parts of the United Kingdom in the Bill and it is important that the expression should have the same meaning throughout all parts.

We are faced with the interesting situation which arose in the strikes Bill: the subject matter of the Bill is reserved, but one must have regard to the fact that its effects spread into devolved areas, which have their own systems of justice, health and education. That is why there is great force in the point that the noble Baroness, Lady Fraser, has been making. I join the noble Lord, Lord Foulkes, in endorsing what she said without going back into the detail, but remind the Minister that devolution exists, even though we are dealing with reserved matters.

My Lords, this is unfamiliar territory for me, but the comprehensive introduction of the noble Baroness, Lady Fraser, has clarified the issue. I am only disappointed that we had such a short speech from the noble Lord, Lord Foulkes—uncharacteristic, perhaps I could say—but it was good to hear from the noble and learned Lord, Lord Hope, on this subject as well. The noble Baroness’s phrase “devolution deficit” is very useful shorthand for some of these issues. She has raised a number of questions about the Secretary of State’s powers under Clause 53(5)(c): the process, the method of consultation and whether there is a role for Ofcom’s national advisory committees. Greater transparency in order to understand which offences overlap in all this would be very useful. She deliberately did not go for one solution or another, but issues clearly arise where the thresholds are different. It would be good to hear how the Government are going to resolve this issue.

My Lords, it is a pity that we have not had the benefit of hearing from the Minister, because a lot of his amendments in this group seem to bear on some of the more generic points made in the very good speech by the noble Baroness, Lady Fraser. I assume he will cover them, but I wonder whether he would at least be prepared to answer any questions people might come back with—not in any aggressive sense; we are not trying to scare the pants off him before he starts. For example, the points made by the noble Lord, Lord Clement-Jones, intrigue me.

I used to have responsibility for devolved issues when I worked at No. 10 for a short period. It was a bit of a joke, really. Whenever anything Welsh happened, I was immediately summoned down to Cardiff and hauled over the coals. You knew when you were in trouble when they all stopped speaking English and started speaking Welsh; then, you knew there really was an issue, whereas before I just had to listen, go back and report. In Scotland, nobody came to me anyway, because they knew that the then Prime Minister was a much more interesting person to talk to about these things. They just went to him instead, so I did not really learn very much.

I noticed some issues in the Marshalled List that I had not picked up on when I worked on this before. I do not know whether the Minister wishes to address this—I do not want to delay the Committee too much—but are we saying that to apply a provision in the Bill to the Bailiwick of Guernsey or the Isle of Man, an Order in Council is required to bypass Parliament? Is that a common way of proceeding in these places? I suspect that the noble and learned Lord, Lord Hope, knows much more about this than I do—he shakes his head—but this is a new one on me. Does it mean that this Parliament has no responsibility for how its laws are applied in those territories, or are there other procedures of which we are unaware?

My second point again picks up what the noble Lord, Lord Clement-Jones, was saying. Could the Minister go through in some detail the process by which a devolved authority would apply to the Secretary of State—presumably for DSIT—to seek consent for a devolved offence to be included in the Online Safety Bill regime? If this is correct, who grants to what? Does this come to the House as a statutory instrument? Is just the Secretary of State involved, or does it go to the Privy Council? Are there other ways that we are yet to know about? It would be interesting to know.

To echo the noble Lord, Lord Clement-Jones, we probably do need a letter from the Minister, if he ever gets this cleared, setting out exactly how the variation in powers would operate across the four territories. If there are variations, we would like to know about them.

My Lords, I am very grateful to my noble friend Lady Fraser of Craigmaddie for her vigilance in this area and for the discussion she had with the Bill team, which they and I found useful. Given the tenor of this short but important debate, I think it may be helpful if we have a meeting for other noble Lords who also want to benefit from discussing some of these things in detail, and particularly to talk about some of the issues the noble Lord, Lord Stevenson of Balmacara, just raised. It would be useful for us to talk in detail about general questions on the operation of the law before we look at this again on Report.

In a moment, I will say a bit about the government amendments which stand in my name. I am sure that noble Lords will not be shy in taking the opportunity to interject if questions arise, as they have not been shy on previous groups.

I will start with the amendments tabled by my noble friend Lady Fraser. Her Amendment 58 seeks to add reference to the Human Rights Act 1998 to Clause 18. That Act places obligations on public authorities to act compatibly with the European Convention on Human Rights. It does not place obligations on private individuals and companies, so it would not make sense for such a duty on internet services to refer to the Human Rights Act.

Under that Act, Ofcom has obligations to act in accordance with the right to freedom of expression under Article 10 of the European Convention on Human Rights. As a result, the codes that Ofcom draws up will need to comply with the Article 10 right to freedom of expression. Schedule 4 to the Bill requires Ofcom to ensure that measures which it describes in a code of practice are designed in light of the importance of protecting the right of users’

“freedom of expression within the law”.

Clauses 44(2) and (3) provide that platforms will be treated as complying with their freedom of expression duty if they take the recommended measures that Ofcom sets out in the codes. Platforms will therefore be guided by Ofcom in taking measures to comply with its duties, including safeguards for freedom of expression through codes of practice.

My noble friend’s Amendment 136 seeks to add offences under the Hate Crime and Public Order (Scotland) Act 2021 to Schedule 7. Public order offences are already listed in Schedule 7 to the Bill, which will apply across the whole United Kingdom. This means that all services in scope will need proactively to tackle content that amounts to an offence under the Public Order Act 1986, regardless of where the content originates or where in the UK it can be accessed.

The priority offences list has been developed with the devolved Administrations, and Clause 194 outlines the parliamentary procedures for updating it. The requirements for consent will be set out in the specific subordinate legislation that may apply to the particular offence being made by the devolved authorities—that is to say, they will be laid down by the enabling statutes that Parliament will have approved.

Amendment 228 seeks to require the inclusion of separate analyses of users’ online experiences in England, Wales, Scotland and Northern Ireland in Ofcom’s transparency reports. These transparency reports are based on the information requested from category 1, 2A and 2B service providers through transparency reporting. I assure my noble friend that Ofcom is already able to request country-specific information from providers in its transparency reports. The legislation sets out high-level categories of information that category 1, 2A and 2B services may be required to include in their transparency reports. The regulator will set out in a notice the information to be requested from the provider, the format of that information and the manner in which it should be published. If appropriate, Ofcom may request specific information in relation to each country in the UK, such as the number of users encountering illegal content and the incidence of such content.

Ofcom is also required to undertake consultation before producing guidance about transparency reporting. In order to ensure that the framework is proportionate and future-proofed, however, it is vital to allow the regulator sufficient flexibility to request the types of information that it sees as relevant, and for that information to be presented by providers in a manner that Ofcom has deemed to be appropriate.

Similarly, Amendment 225A would require separate analyses of users’ online experiences in England, Wales, Scotland and Northern Ireland in Ofcom’s research about users’ experiences of regulated services. Clause 141 requires that Ofcom make arrangements to undertake consumer research to ascertain public opinion and the experiences of UK users of regulated services. Ofcom will already be able to undertake this research on a country-specific basis. Indeed, in undertaking its research and reporting duties, as my noble friend alluded to, Ofcom has previously adopted such an approach. For instance, it is required by the Communications Act 2003 to undertake consumer research. While the legislation does not mandate that Ofcom conduct and publish nation-specific research, Ofcom has done so, for instance through its publications Media Nations and Connected Nations. I hope that gives noble Lords some reassurance of its approach in this regard. Ensuring that Ofcom has flexibility in carrying out its research functions will enable us to future-proof the regulatory framework, and will mean that its research activity is efficient, relevant and appropriate.

I will now say a bit about the government amendments standing in my name. I should, in doing so, highlight that I have withdrawn Amendments 304C and 304D, previously in the Marshalled List, which will be replaced with new amendments to ensure that all the communications offences, including the new self-harm offence, have the appropriate territorial extent when they are brought forward. They will be brought forward as soon as possible once the self-harm offence has been tabled.

Amendments 267A, 267B, 267C, 268A, 268B to 268G, 271A to 271D, 304A, 304B and 304E are amendments to Clauses 160, 162, 164 to 166, 168 and 210 and Schedule 14, relating to the extension of the false and threatening communications offences and the associated liability of corporate officers in Clause 166 to Northern Ireland.

This group also includes some technical and consequential amendments to the false and threatening communications offences and technical changes to the Malicious Communications (Northern Ireland) Order 1988 and Section 127 of the Communications Act 2003. This will minimise overlap between these existing laws and the new false and threatening communications offences in this Bill. Importantly, they mirror the approach taken for England and Wales, providing consistency in the criminal law.

This group also contains technical amendments to update the extent of the epilepsy trolling offence to reflect that it applies to England, Wales and Northern Ireland.

Amendment 286B is a technical amendment to repeal a provision in the Digital Economy Act 2017 that will become redundant when Part 3 of that Act is repealed by this Bill.

Amendments 304F and 304G give the Bailiwick of Guernsey and the Isle of Man the power to extend the Online Safety Bill to their jurisdictions, should they wish. Amendments 304A and 304H to 304K have been tabled to reflect the Bailiwick of Jersey opting to forgo a permissive extent clause in this instance.

With the offer of a broader meeting to give other noble Lords the benefit of the discussions with the Bill team that my noble friend has had—I extend that invitation to her, of course, to continue the conversation with us—I hope that provides information about the government amendments in this group and some reassurance on the points that my noble friend has made. I hope that she will be willing to withdraw her amendment and that noble Lords will accept the government amendments.

I suggested that we might see a table, independent of the meetings, although I am sure they could coincide. Would it be possible to have a table of all the criminal offences that the Minister listed and how they apply in each of the territories? Without that, we are a bit at sea as to exactly how they apply.

Yes, that would be a sensible way to view it. We will work on that and allow noble Lords to see it before they come to talk to us about it.

I put on record that the withdrawal of Part 3 of the Digital Economy Act 2017 will be greeted with happiness only should the full schedule of AV and harms be put into the Bill. I must say that because the noble Baroness, Lady Benjamin, is not in her place. She worked very hard for that piece of legislation.

My Lords, I thank the Minister for his response. I take it as a win that we have been offered a meeting and further discussion, and the noble Lord, Lord Foulkes, agreeing with every word I said. I hope we can continue in this happy vein in my time in this House.

The suggestion from the noble Lord, Lord Stevenson, of a table is a welcome one. Something that has interested me is that some of the offences the Minister mentioned were open goals: there were holes leaving it open in Northern Ireland and not in England and Wales, or whatever. For example, epilepsy trolling is already a criminal offence in Scotland, but I am not sure that was appreciated when we started this discussion.

I look forward to the meeting and I thank the Minister for his response. I am still unconvinced that we have the right consultation process for any devolved authority wanting to apply for a subordinate devolved Administration to be included under this regime.

It concerns me that the Minister talked about leaving the requesting of data to what Ofcom deemed to be appropriate. The feeling on the ground is that Ofcom, which is based in London, may not understand what is or is not necessarily appropriate in the devolved Administrations. The fact that in other legislation—for example, on broadcasting—it is mandated that it is broken down nation by nation is really important. It is even more important because of the interplay between the devolved and the reserved matters. The fact that there is no equivalent Minister in the Scottish Government to talk about digital and online safety things with means that a whole raft of different people will need to have relationships with Ofcom who have not had them hitherto.

I thank the Minister. On that note, I withdraw my amendment.

Amendment 58 withdrawn.

Amendments 59 to 64 not moved.

Clause 18 agreed.

Clause 19: Record-keeping and review duties

Amendments 64A to 64D

Moved by

64A: Clause 19, page 21, line 36, leave out “all”

Member’s explanatory statement

This is a technical amendment needed because the new duty to supply records of risk assessments to OFCOM (see the amendment in the Minister’s name inserting new subsection (8A) below) is imposed only on providers of Category 1 services.

64B: Clause 19, page 21, line 36, at end insert “(as indicated by the headings).”

Member’s explanatory statement

This amendment provides clarification because the new duty to supply records of risk assessments to OFCOM (see the amendment in the Minister’s name inserting new subsection (8A) below) is imposed only on providers of Category 1 services.

64C: Clause 19, page 21, line 38, after “of” insert “all aspects of”

Member’s explanatory statement

This amendment concerns a duty imposed on providers to keep records of risk assessments. The added words make it clear that full records must be made.

64D: Clause 19, page 21, line 38, at end insert “, including details about how the assessment was carried out and its findings.”

Member’s explanatory statement

This amendment concerns a duty imposed on providers to keep records of risk assessments. The added words make it clear that full records must be made.

Amendments 64A to 64D agreed.

Amendments 65 and 65ZA not moved.

Amendment 65A

Moved by

65A: Clause 19, page 22, line 26, at end insert—

“(8A) As soon as reasonably practicable after making a record of a risk assessment as required by subsection (2), or revising such a record, a duty to supply OFCOM with a copy of the record (in full).”

Member’s explanatory statement

This amendment requires providers of Category 1 services to supply copies of their records of risk assessments to OFCOM. The limitation to Category 1 services is achieved by an amendment in the name of the Minister to clause 6.

Amendment 65A agreed.

Amendment 65AA not moved.

Clause 19, as amended, agreed.

Clause 20: Providers of search services: duties of care

Amendments 65B to 65E

Moved by

65B: Clause 20, page 23, line 5, leave out “and (3)” and insert “to (3A)”

Member’s explanatory statement

This technical amendment is consequential on the other changes to clause 20 (arising from the new duties in clauses 23, 25 and 29 which are imposed on providers of Category 2A services only - see the amendments in the Minister’s name to those clauses below).

65C: Clause 20, page 23, line 10, at end insert “(2) to (8)”

Member’s explanatory statement

This amendment is consequential on the amendments in the Minister’s name to clause 23 below (because the new duty to summarise illegal content risk assessments in a publicly available statement is only imposed on providers of Category 2A services).

65D: Clause 20, page 23, line 15, at end insert “(2) to (6)”

Member’s explanatory statement

This amendment is consequential on the amendments in the Minister’s name to clause 29 below (because the new duty to supply records of risk assessments to OFCOM is only imposed on providers of Category 2A services).

65E: Clause 20, page 23, line 15, at end insert—

“(2A) Additional duties must be complied with by providers of particular kinds of regulated search services, as follows.”

Member’s explanatory statement

This technical amendment is consequential on the other changes to clause 20 (arising from the new duties in clauses 23, 25 and 29 which are imposed on providers of Category 2A services only - see the amendments in the Minister’s name to those clauses below).

Amendments 65B to 65E agreed.

Amendment 66 not moved.

Amendments 66A to 66D

Moved by

66A: Clause 20, page 23, line 16, leave out “In addition,”

Member’s explanatory statement

This technical amendment is consequential on the other changes to clause 20 (arising from the new duties in clauses 23, 25 and 29 which are imposed on providers of Category 2A services only - see the amendments in the Minister’s name to those clauses below).

66B: Clause 20, page 23, line 20, at end insert “(2) to (8)”

Member’s explanatory statement

This amendment is consequential on the amendments in the Minister’s name to clause 25 below (because the new duty to summarise children’s risk assessments in a publicly available statement is only imposed on providers of Category 2A services).

66C: Clause 20, page 23, line 20, at end insert—

“(3A) All providers of regulated search services that are Category 2A services must comply with the following duties in relation to each such service which they provide—
(a) the duty about illegal content risk assessments set out in section 23(8A),
(b) the duty about children’s risk assessments set out in section 25(8A), and
(c) the duty about record-keeping set out in section 29(8A).”

Member’s explanatory statement

This amendment ensures that the new duties set out in the amendments in the Minister’s name to clauses 23, 25 and 29 below (duties to summarise risk assessments in a publicly available statement and to supply records of risk assessments to OFCOM) are imposed on providers of Category 2A services only.

66D: Clause 20, page 23, line 21, at end insert—

“(5) For the meaning of “Category 2A service”, see section 86 (register of categories of services).”

Member’s explanatory statement

This amendment inserts a signpost to the meaning of “Category 2A service”.

Amendments 66A to 66D agreed.

Clause 20, as amended, agreed.

Clause 21 agreed.

Clause 22: Illegal content risk assessment duties

Amendment 66DA not moved.

Amendment 66E

Moved by

66E: Clause 22, page 24, line 38, after “29(2)” insert “and (8A)”

Member’s explanatory statement

This amendment inserts a signpost to the new duty in clause 29 about supplying records of risk assessments to OFCOM.

Amendment 66E agreed.

Clause 22, as amended, agreed.

Clause 23: Safety duties about illegal content

Amendments 66F and 66G

Moved by

66F: Clause 23, page 24, line 42, leave out “all”

Member’s explanatory statement

This is a technical amendment needed because the new duty to summarise illegal content risk assessments in a publicly available statement (see the amendment in the Minister’s name inserting new subsection (8A) below) is imposed only on providers of Category 2A services.

66G: Clause 23, page 24, line 42, at end insert “(as indicated by the headings).”

Member’s explanatory statement

This amendment provides clarification because the new duty to summarise illegal content risk assessments in a publicly available statement (see the amendment in the Minister’s name inserting new subsection (8A) below) is imposed only on providers of Category 2A services.

Amendments 66F and 66G agreed.

Amendments 67 to 72 not moved.

Amendment 72A

Moved by

72A: Clause 23, page 25, line 31, at end insert—

“(8A) A duty to summarise in a publicly available statement the findings of the most recent illegal content risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to individuals).”

Member’s explanatory statement

This amendment requires providers of Category 2A services to summarise (in a publicly available statement) the findings of their latest risk assessment regarding illegal content. The limitation to Category 2A services is achieved by an amendment in the name of the Minister to clause 20.

Amendment 72A agreed.

Clause 23, as amended, agreed.

Amendment 73 not moved.

Clause 24: Children’s risk assessment duties

Amendments 74 and 75 not moved.

Amendment 75A

Moved by

75A: Clause 24, page 26, line 45, after “29(2)” insert “and (8A)”

Member’s explanatory statement

This amendment inserts a signpost to the new duty in clause 29 about supplying records of risk assessments to OFCOM.

Amendment 75A agreed.

Clause 24, as amended, agreed.

Clause 25: Safety duties protecting children

Amendment 75B

Moved by

75B: Clause 25, page 27, line 4, at end insert “(as indicated by the headings).”

Member’s explanatory statement

This amendment provides clarification because the new duty to summarise children’s risk assessments in a publicly available statement (see the amendment in the Minister’s name inserting new subsection (8A) below) is imposed only on providers of Category 2A services.

Amendment 75B agreed.

Amendments 76 to 81 not moved.

Amendment 81A

Moved by

81A: Clause 25, page 27, line 46, at end insert—

“(8A) A duty to summarise in a publicly available statement the findings of the most recent children’s risk assessment of a service (including as to levels of risk and as to nature, and severity, of potential harm to children).”

Member’s explanatory statement

This amendment requires providers of Category 2A services to summarise (in a publicly available statement) the findings of their latest children’s risk assessment. The limitation to Category 2A services is achieved by an amendment in the name of the Minister to clause 20.

Amendment 81A agreed.

Amendments 82 to 85 not moved.

Clause 25, as amended, agreed.

Clause 26: Duty about content reporting

Amendment 86 not moved.

Clause 26 agreed.

Clause 27: Duties about complaints procedures

Amendment 87 not moved.

Clause 27 agreed.

Clause 28: Duties about freedom of expression and privacy

Amendment 88 not moved.

Clause 28 agreed.

Clause 29: Record-keeping and review duties

Amendments 88A to 88D

Moved by

88A: Clause 29, page 31, line 4, leave out “all”

Member’s explanatory statement

This is a technical amendment needed because the new duty to supply records of risk assessments to OFCOM (see the amendment in the Minister’s name inserting new subsection (8A) below) is imposed only on providers of Category 2A services.

88B: Clause 29, page 31, line 4, at end insert “(as indicated by the headings).”

Member’s explanatory statement

This amendment provides clarification because the new duty to supply records of risk assessments to OFCOM (see the amendment in the Minister’s name inserting new subsection (8A) below) is imposed only on providers of Category 2A services.

88C: Clause 29, page 31, line 6, after “of” insert “all aspects of”

Member’s explanatory statement

This amendment concerns a duty imposed on providers to keep records of risk assessments. The added words make it clear that full records must be made.

88D: Clause 29, page 31, line 6, at end insert “, including details about how the assessment was carried out and its findings.”

Member’s explanatory statement

This amendment concerns a duty imposed on providers to keep records of risk assessments. The added words make it clear that full records must be made.

Amendments 88A to 88D agreed.

Amendments 89 and 90 not moved.

Amendment 90A

Moved by

90A: Clause 29, page 31, line 37, at end insert—

“(8A) As soon as reasonably practicable after making a record of a risk assessment as required by subsection (2), or revising such a record, a duty to supply OFCOM with a copy of the record (in full).”

Member’s explanatory statement

This amendment requires providers of Category 2A services to supply copies of their records of risk assessments to OFCOM. The limitation to Category 2A services is achieved by an amendment in the name of the Minister to clause 20.

Amendment 90A agreed.

Amendment 90B not moved.

Clause 29, as amended, agreed.

Amendments 91 and 91A not moved.

Clause 30: Children’s access assessments

Amendment 92 not moved.

Clause 30 agreed.

Clause 31 agreed.

Schedule 3 agreed.

Amendment 93 not moved.

Clause 32 agreed.

Clause 33: Duties about fraudulent advertising: Category 1 services

Amendment 94 not moved.

Clause 33 agreed.

Clause 34: Duties about fraudulent advertising: Category 2A services

Amendment 95 not moved.

Clause 34 agreed.

Clause 35 agreed.

Amendment 96

Moved by

96: After Clause 35, insert the following new Clause—

“Suicide or self-harm content duties

(1) This section sets out the duties about harmful suicide or self-harm content which apply to all regulated user-to-user services and providers of search services.
(2) This section applies in respect of all service users.
(3) A duty to include provisions in the terms of service specifying the treatment to be applied in relation to harmful suicide or self-harm content.
(4) The possible kinds of treatment of content referred to in subsection (3) are—
(a) taking down the content;
(b) restricting users’ access to the content;
(c) limiting the recommendation or promotion of the content.
(5) A duty to explain in the terms of service the provider’s response to the risks relating to harmful suicide or self-harm content by reference to—
(a) any provisions of the terms of service included in compliance with the duty set out in subsection (3), and
(b) any other provisions of the terms of service designed to mitigate or manage those risks.
(6) If provisions are included in the terms of service in compliance with the duty set out in subsection (3), a duty to ensure that those provisions—
(a) are clear and accessible, and
(b) are applied consistently in relation to content which meets the definition in section 207.”

Member’s explanatory statement

This creates a duty for providers of regulated user-to-user services and search services to manage harmful suicide or self-harm content, applicable to both children and adults.

I am particularly grateful to the noble Lords who co-signed Amendments 96, 240 and 296 in this group. Amendment 225 is also important and warrants careful consideration, as it explicitly includes eating disorders. These amendments have strong support from Samaritans, which has helped me in drafting them, and from the Mental Health Foundation and the BMA. I declare that I am an elected member of the BMA ethics committee.

We have heard much in Committee about the need to protect children online more effectively even than in the Bill. On Tuesday the noble Baroness, Lady Morgan of Cotes, made a powerful speech acknowledging that vulnerability does not stop at the age of 18 and that the Bill currently creates a cliff edge whereby there is protection from harmful content for those under 18 but not for those over 18. The empowerment tools will be futile for those seriously contemplating suicide and self-harm. No one should underestimate the power of suicide contagion and the addictive nature of the content that is currently pushed out to people, goading them into such actions and drawing them into repeated viewings.

Amendment 96 seeks to redress that. It incorporates a stand-alone provision, creating a duty for providers of user-to-user services to manage harmful content about suicide or self-harm. This provision would operate as a specific category, relevant to all regulated services and applicable to both children and adults. Amendment 296 defines harmful suicide or self-harm content. It is important that we define that to avoid organisations such as Samaritans, which provide suicide prevention support, being inadvertently caught up in clumsy, simplistic search engine categorisation.

Suicide and self-harm content affects people of all ages. Adults in distress search the internet, and children easily bypass age-verification measures and parental controls even when they have been switched on. The Samaritans Lived Experience Panel reported that 82% of people who died by suicide, having visited websites that encouraged suicide and/or methods of self-harm, were over the age of 25.

Samaritans considers that the types of suicide and self-harm content that are legal but unequivocally harmful include, but are not limited to, information, depictions, instructions and advice on methods of self-harm and suicide; content that portrays self-harm and suicide as positive or desirable; and graphic descriptions or depictions of self-harm and suicide. As the Bill stands, platforms will not even need to consider the risk that such content could pose to adults. This will leave all that dangerous online content widely available and undermines the Bill’s intention from the outset.

Last month, other parliamentarians and I met Melanie, whose relative Jo died by suicide in 2020. He was just 23. He had accessed suicide-promoting content online, and his family are speaking out to ensure that the Bill works to avoid future tragedies. A University of Bristol study reported that those with severe suicidal thoughts actively use the internet to research effective methods and often find clear suggestions. Swansea University reported that three quarters of its research participants had harmed themselves more severely after viewing self-harm content online.

Amendment 240 complements the other amendments in this group, although it would not rely on them to be effective. It would establish a specific unit in Ofcom to monitor the prevalence of suicide, self-harm and harmful content online. I should declare that this is in line with the Private Member’s Bill I have introduced. In practice, that means that Ofcom would need to assess the efficacy of the legislation in practice. It would require Ofcom to investigate the content and the algorithms that push such content out to individuals at an alarming rate.

Researchers at the Center for Countering Digital Hate set up new accounts in the USA, UK, Canada and Australia at the minimum age TikTok allows, which is 13. These accounts paused briefly on videos about body image and mental health, and “liked” them. Within 2.6 minutes, TikTok recommended suicide content, and it sent content on eating disorders within eight minutes.

Ofcom’s responsibility for ongoing review and data collection, reported to Parliament, would take a future-facing approach covering new technologies. New communications and internet technologies are being developed at pace in ways we cannot imagine. The term

“in a way equivalent … to”

in Amendment 240 is specifically designed to include the metaverse, where interactions are instantaneous, virtual and able to incite, encourage or provoke serious harm to others.

We increasingly live our lives online. Social media is expanding, while user-to-user sites are now shopping platforms for over 70% of UK consumers. However, the online world is also being used to sell suicide kits or lethal substances, as recently covered in the press. It is important that someone holds the responsibility for reporting on dangers in the online world. A systematic review found that harmful suicide content—methods and encouragement—is massed on sites with low levels of moderation and easy search functions for images. Some 78% of people with lived experience of suicidality and self-harm surveyed by Samaritans agree that new laws are needed to make online spaces safer.

I urge noble Lords to support my amendments, which aim to ensure that self-harm, suicide and seriously harmful content is addressed across all platforms in all categories as well as search engines, regardless of their functionality or reach, and for all persons, regardless of age. Polling by Samaritans has shown high support for this: four out of five agree that harmful suicide and self-harm content can damage adults as well as children, while three-quarters agree that tech companies should by law prevent such content being shown to users of all ages.

If the Government are not minded to adopt these amendments, can the Minister tell us specifically how the Bill will take a comprehensive approach to placing duties on all platforms to reduce dangerous content promoting suicide and self-harm? Can the Government confirm that smaller sites, such as forums that encourage suicide, will need to remove priority illegal content, whatever the level of detail in their risk assessment? Lastly—I will give the Minister a moment to note my questions—do the Government recognise that we need an amendment on Report to create a new offence of assisting or encouraging suicide and serious self-harm? I beg to move.

My Lords, I particularly support Amendment 96, to which I have added my name; it is a privilege to do so. I also support Amendment 296 and I cannot quite work out why I have not added my name to it, because I wholeheartedly agree with it, but I declare my support now.

I want to talk again about an issue that the noble Baroness, Lady Finlay, set out so well and that we also touched on last week, about the regulation of suicide and self-harm content. We have all heard of the tragic case of Molly Russell, but a name that is often forgotten in this discussion is Frankie Thomas. Frankie was a vulnerable teenager with childhood trauma, functioning autism and impulsivity. After reading a story about self-harm on the app Wattpad, according to the coroner’s inquest, she went home and undertook

“a similar act, resulting in her death”.

I do not need to repeat the many tragic examples that have already been shared in this House, but I want to reiterate the point already made by the BMA in its very helpful briefing on these amendments: viewing self-harm and suicide content online can severely harm the user offline. As I said last week when we were debating the user empowerment tools, this type of content literally has life or death repercussions. It is therefore essential that the Bill takes this sort of content more seriously and creates specific duties for services to adhere to.

We will, at some point this evening—I hope—come on to debate the next group of amendments. The question for Ministers to answer on this group, the next one and others that we will be debating is, where we know that content is harmful to society—to individuals but also to broader society—why the Government do not want to take the step of setting out how that content should be properly regulated. I think it all comes from their desire to draw a distinction between content that is illegal and content that is not illegal but is undoubtedly, in the eyes of pretty well every citizen, deeply harmful. As we have already heard from the noble Baroness, and as we heard last week, adults do not become immune to suicide and self-harm content the minute they turn 18. In fact, I would argue that no adult is immune to the negative effects of viewing this type of content online.

This amendment, therefore, is very important, as it would create a duty for providers of regulated user-to-user services and search engines to manage harmful suicide or self-harm content applicable to both children and adults, recognising this cliff edge otherwise in the Bill, which we have already talked about. I strongly urge noble Lords, particularly the Minister, to agree that protecting users from this content is one of the most important things that the Bill can do. People outside this House are looking to us to do this, so I urge the Government to support this amendment today.

My Lords, I am pleased that we have an opportunity, in this group of amendments, to talk about suicide and self-harm content, given the importance of it. It is important to set out what we expect to happen with this legislation. I rise particularly to support Amendment 225, to which my noble friend Lady Parminter added her name. I am doing this more because the way in which this kind of content is shared is incredibly complex, rather than simply because of the question of whether it is legal or illegal.

Our goal in the regulations should actually be twofold. First, we do, of course, want to reduce the likelihood that somebody who is at lower risk of suicide and self-harm might move into the higher-risk group as a result of their online activity on user-to-user services. That is our baseline goal. We do not want anyone to go from low risk to high risk. Secondly, as well as this “create no new harm” goal, we have a harm reduction goal, which is that people who are already at a higher risk of suicide and self-harm might move into a lower-risk category through the use of online services to access advice and support. It is really important, in this area, that we do not lose sight of the fact there are two aspects to the use of online services. It is no simple task to try to achieve both these goals, as they can sometimes be in tension with each other in respect of particular content types.

There is a rationale for removing all suicide and self-harm content, as that is certainly a way to achieve that first goal. It makes it less likely that a low-risk person would encounter—in the terms of the Bill—or be exposed to potentially harmful content. Some countries certainly do take that approach. They say that anything that looks like it is encouraging suicide or self-harm should be removed, full stop. That is a perfectly rational and legitimate approach.

There is, however, a cost to this approach, which I wish to tease out. It would be helpful in this debate to understand that cost, as it might not be immediately apparent. It is that there are different kinds of individuals posting this content. If we look at the experience of what happens on online platforms, there is certainly a community of people who post content with the express aim of hurting others: people whom we often call trolls, who are small in number but incredibly toxic. They are putting out suicide and self-harm content because they want other people to suffer. They might think it is funny but, whatever they think, they are doing it with an expressly negative intent.

There is also a community of individuals and organisations who believe that they are sharing content to help those who are at risk. This can vary: some are formal organisations such as the Samaritans, and others are enterprising individuals, sometimes people who have themselves had experiences that they wish to share, who will create online fora and share content. It might be content that looks similar to content that appears harmful, but their express goal is to help others online. Most of these fora are for that purpose. Then there are the individuals themselves, who are looking for advice and support relevant to what is happening in their own lives and to connect with others who share their experiences.

We might see the same piece of content very differently when posted by people in these groups. If an individual in that troll group is showing an image of self-harm, that is an aggressive, harmful act; there is no excuse for it, and we want to get rid of it. However, the same content might be part of an educational exchange when posted by an expert organisation. The noble Baroness, Lady Finlay, said that we needed to make sure that this new legislation did not inadvertently sweep up those who were in that educational space.

The hardest group is the group of individuals, where, in many cases, the posting of that content is a cry for help, and an aggressive response by the platform can, sadly, be counterproductive to that individual if they have gone online to seek help. The effect of that is that the content is removed and, because they violated the platform’s terms of service, that person who is feeling lonely and vulnerable might lose social media accounts that are important to them for seeking help. Therefore, by seeking to reduce their exposure to content, we might inadvertently end up creating a scenario in which they lose all that is valuable to them. That is the other inadvertent harm that we want to ensure we avoid in regulating and seeking to have Ofcom issue the most appropriate guidance.

We should be able to advance both goals: removing the content that is posted with harmful intent but enabling content that is there as a cry for help, or as a support and advice service. It is in that context that something like the proposal for an expert group for Ofcom is very helpful. Again, having worked at a platform, I can say that we often reached out to advisers and sought help. Sometimes, the advice was conflicting. Some people would say it was really important that if someone was sharing images of self-harm they should be got rid of; others would say that, in certain contexts, it was really important to allow that person to share the image of self-harm and have a discussion with others—and that maybe the response was to use that as a trigger, to point them towards a support service that they need.

Again, protocols were developed for when somebody is at imminent risk of suicide and the solution lies beyond anything the platform itself can do. If a platform has detected that somebody is at imminent risk of suicide, it needs to find a way to ensure that either a support body such as the Samaritans or, in many cases, the police are notified so that they can go to that person’s house, knock on the door and prevent the suicide happening. Platforms in some countries have the relationships that they need with local bodies. Making that notification is very sensitive; you are disclosing highly sensitive personal data to an outside body, against the individual’s wishes. There will not be consent from them, in many cases, and that has to be worked through.

If we are thinking about protocols for dealing with self-harm content, we will reach some of the same issues. It may be that informing parents, a school or some other body to get help to that individual would be the right thing to do. That is very sensitive in terms of the data disclosure and privacy aspects.

The Bill is an opportunity to improve all of this. There are pieces of very good practice and, clearly, areas where not enough is being done and too much very harmful content—particularly content that is posted with the express intent of causing harm—is being allowed to circulate. I hope that, through the legislation and by getting these protocols right, we can get to the point where we are both preventing lower-risk people moving into a higher-risk category and enabling people already in a high-risk category to get the help, support and advice that they need. Nowadays, online is often the primary tool that could benefit them.

My Lords, as usual, the noble Lord, Lord Allan of Hallam, has explained with some nuance the trickiness of this area, which at first sight appears obvious—black and white—but is not quite that. I want to explore some of these things.

Many decades ago, I ran a drop-in centre for the recovering mentally ill. I remember my shock the first time that I came across a group of young women who had completely cut themselves up. It was a world that I did not know but, at that time, a very small world—a very minor but serious problem in society. Decades later, going around doing lots of talks, particularly in girls’ schools where I am invited to speak, I suddenly discovered that whole swathes of young women were doing something that had been considered a mental health problem, often hidden away. Suddenly, people were talking about a social contagion of self-harm happening in the school. Similarly, there were discussions about eating disorders being not just an individual mental health problem but something that kind of grew within a group.

Now we have the situation with suicide sites, which are phenomenal at exploiting those vulnerabilities. This is undoubtedly a social problem of some magnitude. I do not in any way want to minimise it, but I am not sure exactly how legislation can resolve it or whether it will, even though I agree that it could do certain things.

Some of the problems that we have, which have already been alluded to, really came home to me when I read about Instagram bringing in some rules on self-harm material, which ended up with the removal of posts by survivors of self-harm discussing their illness. I read a story in the Coventry Evening Telegraph—I was always interested because I ran the drop-in centre for the recovering mentally ill in Coventry—where a young woman had some photographs taken down from Instagram because they contained self-harm images of her scars. The young woman who had suffered these problems, Marie from Tile Hill, wanted to share pictures of her scars with other young people because she felt that they would help others recover. She had got over it and was basically saying that the scars were healing. In other words, it was a kind of self-help group for users online, yet it was taken down.

It is my usual problem: this looks to be a clear-cut case, yet the complexities can lead to problems of censorship of a sort. I was really pleased that the noble Baroness, Lady Finlay, stressed the point about definitions. Search engines such as Google have certainly raised the concern that people looking for help—or even looking to write an essay on suicide, assisted suicide or whatever—will end up not being able to find appropriate material.

I also want to ask a slightly different question. Who decides which self-harms are in our definitions, and what counts as the contagion? When I visit schools now, there is a new social contagion in town, I am afraid to say, which is that of gender dysphoria. In the polling for Jo-Anne Nadler’s newly published report Show, Tell and Leave Nothing to the Imagination, half of the young people interviewed said that they knew someone at their school who wanted to change gender or had already done so, while one in 10 said that they wanted to change their gender.

That is just an observation; your Lordships might ask what it has to do with the Bill. But these are problem areas that are being affirmed by educational organisations and charities, by which I mean organisations that have often worked with government and been consulted as stakeholders. They have recommended to young women where online to purchase chest binders, which will stop them developing, or where and how to use puberty blockers. Eventually, they are affirming double mastectomies or castration. By the way, this is of course all over social media, because once you start to search on it, TikTok is rife with it. Algorithmically, we are reminded all the time to think about systems: once you have had a look at it, it is everywhere. My point is that this is affirmed socially.

Imagine a situation whereby, in society offline, some young woman who has an eating disorder and weighs 4 stone comes to you and says “Look, I’m so fat”. If you said, “Yes, of course you’re fat—I’ll help you slim”, we would think it was terrible. When that happens online, crudely and sometimes cruelly, we want to tackle it here. If some young woman came to you and said, “I want to self-harm; I feel so miserable that I want to cut myself”, and you started recommending blades, we would think it was atrocious behaviour. In some ways, that is what is happening online and that is where I have every sympathy with these amendments. Yet when it comes to gender dysphoria, which actually means encouraging self-harm, because it is a cultural phenomenon that is popular it does not count.

In some ways, I could be arguing that we should future-proof this legislation by including those self-harms in the definition put forward by the amendments in this group. However, I raise it more to indicate that, as with all definitions, it is not quite as easy as one would think. I appreciate that a number of noble Lords, and others engaged in this discussion, might think that I am merely exhibiting prejudice rather than any genuine compassion or concern for those young people. I would note that if noble Lords want to see a rising group of people who are suicidal and feel that their life is really not worth living, search out the work being done on detransitioners who realise too late that that affirmation by adults has been a disaster for them.

On the amendments suggesting another advisory committee with experts to advise Ofcom on how we regulate such harms, I ask that we are at least cautious about which experts. To mention one of the expert bodies, Mermaids has become controversial and has actually been advocating some of those self-harms, in my opinion. It is now subject to a Charity Commission investigation but has been on bodies such as this advising about young people. I would not think that appropriate, so I just ask that some consideration is given to which experts would be on such bodies.

My Lords, I also support the amendments in the name of my noble friend Lady Finlay. I want to address a couple of issues raised by the noble Lord, Lord Allan. He made a fantastic case for adequate redress systems, both at platform level and at independent complaint level, to really make sure that, at the edge of all the decisions we make, there is sufficient discussion about where that edge lies.

The real issue is not so much the individuals who are in recovery and seeking to show leadership but those who are sent down the vortex of self-harm and suicide material that comes in its scores—in its hundreds and thousands—and completely overwhelms them. We must not, for the sake of the edge case, fail to deal with the issue at hand.

There is absolutely not enough signposting. I have seen at first hand—I will not go through it again; I have told the Committee already—companies justifying material that it was inconceivable to justify as being a cry for help. A child with cuts and blood running down their body is not a cry for help; that is self-harm material.

From experience, I think it is true that companies get defensive and seek to defend the indefensible on occasion. I agree with the noble Baroness on that, but I will balance it a little, as I also worked with people who were agonising over not wanting to make a bad situation worse. They were genuinely struggling and seeking to do the right thing. That is where the experts come in. If the experts said to them, “Look, take this stuff down; that is always better”, it would make their lives easier. If they said, “Please leave it up”, they could follow that advice. Again, that would make their lives easier. On the excuses, I agree that companies are sometimes defending the indefensible, but there are also people agonising over the right thing to do, and we should help them.

I absolutely agree. Of course, good law is a good system, not a good person.

I turn to the comments that I was going to make. Uncharacteristically, I am a little confused about this issue and I would love the Minister’s help. My understanding, on reading the Bill very closely, is that self-harm and suicide content that meets a legal definition will be subject to the priority illegal content duties. In the case of children, we can safely anticipate that content of this kind will be named primary priority content. Additionally, if such content is against the terms of service of a regulated company, the company can be held to those terms. It will have to provide a user empowerment tool on category 1 services so that the content can be toggled out if an adult user wishes. That is my understanding of where this content has already been dealt with in the Bill. To my mind, this leaves the following ways in which suicide and self-harm material, which is the subject of this group of amendments, is not covered by the Bill. That is what I would like the Minister to confirm, and I stand ready to be corrected.

In the case of adults, if self-harm and suicide material does not meet the bar of illegal content and the service is not category 1, there is no mechanism to toggle it out. Ofcom has no power to require a service to provide tools to toggle self-harm and suicide material out by default. This means that self-harm and suicide material can be as prevalent as a service likes—pushed, promoted and recommended, as I have just explained—so long as it is not contrary to the terms of service and does not reach the bar of illegal content.

Search services are not subject to these clauses—I am unsure about that. In the case of both children and adults, if self-harm and suicide material is on blogs or services with limited functionality, it is out of scope of the Bill and there is absolutely nothing Ofcom can do. For non-category 1 services—the majority of services which claim that an insignificant number of children access their site and thus that they do not have to comply with the child safety duties—there are no protections for a child against this content.

I put it like that because I believe that each of the statements I just made could have been fixed by amendments already discussed during the past six days in Committee. We are currently planning to leave many children without the protection of the safety duties, to leave vulnerable adults without even the cover of default protections against material that has absolutely no public interest and to leave companies to decide whether to promote or use this material to fuel user engagement—even if it costs well-being and lives.

I ask the Minister to let me know if I have misunderstood, but I think it is really quite useful to see what is left once the protections are in place, rather than always concentrating on the protections themselves.

My Lords, I support the noble Baroness, Lady Finlay of Llandaff, in her Amendment 96 and others in this group. The internet is fuelling an epidemic of self-harm, often leading to suicide among young people. Thanks to the noble Baroness, Lady Kidron, I have listened to many grieving families explaining the impact that social media had on their beloved children. Content that includes providing detailed instructions for methods of suicide or challenges or pacts that seek agreement to undertake mutual acts of suicide or deliberate self-injury must be curtailed, or platforms must be made to warn and protect vulnerable adults.

I recognise that the Government acknowledge the problem and have attempted to tackle it in the Bill with the new offence of encouraging or assisting serious self-harm and suicide, and by listing it as priority illegal content. But I agree with charities such as Samaritans, which says that the Government are taking a partial approach by not accepting this group of amendments. Samaritans considers that the types of suicide and self-harm content that are legal but unequivocally harmful include: information, depictions, instructions and advice on methods of self-harm or suicide; content that portrays self-harm and suicide as positive or desirable; and graphic descriptions or depictions of self-harm and suicide.

With the removal of regulation of legal but harmful content, much suicide and self-harm content can remain easily available, and platforms will not even need to consider the risk that such content could pose to adult users. These amendments aim to ensure that harmful self-harm and suicide content is addressed across all platforms and search services, regardless of their functionality or reach, and, importantly, for all persons regardless of age.

In 2017 an inquiry into suicides of young people found suicide-related internet use in 26% of deaths in under-20s and 13% of deaths in 20 to 24 year-olds. Three-quarters of people who took part in Samaritans’ research with Swansea University said that they had harmed themselves more severely after viewing self-harm content online, as the noble Baroness, Lady Finlay, pointed out. People of all ages can be susceptible to harm from this dangerous content. There is shocking evidence that between 2011 and 2015, 151 patients who died by suicide were known to have visited websites that encouraged suicide or shared information about methods of harm, and 82% of those patients were over 25.

Suicide is complex and rarely caused by one thing. However, there is strong evidence of associations between financial difficulties, mental health and suicide. People on the lowest incomes have a higher suicide risk than those who are wealthier, and people on lower incomes are also the most affected by rising prices and other types of financial hardship. In January and February this year the Samaritans saw the highest percentage of first-time phone callers concerned about finance or unemployment—almost one in 10 calls for help in February. With the cost of living crisis and growing pressure on adults to cope with stress, it is imperative that the Government urgently bring in these amendments to help protect all ages from harmful suicide and self-harm content by putting a duty on providers of user-to-user services to properly manage such content.

A more comprehensive online safety regime for all ages will also increase protections for children, as research has shown that age verification and restrictions across social media and online platforms are easily bypassed by them. As the Bill currently stands, there is a two-tier approach to safety, which can still mean that children may circumvent safety controls and find this harmful suicide and self-harm content.

Finally, user empowerment duties that we debated earlier are no substitute for regulation of access to dangerous suicide and self-harm online content through the law that these amendments seek to achieve.

My Lords, I thank the noble Baroness, Lady Finlay, for introducing the amendments in the way she did. I think that what she has done, and what this whole debate has done, is to ask the question that the noble Baroness, Lady Kidron, posed: we do not know yet quite where the gaps are until we see what the Government have in mind in terms of the promised new offence. But it seems pretty clear that something along the lines of what has been proposed in this debate needs to be set out as well.

One of the most moving aspects of being part of the original Joint Committee on the draft Bill was the experience of listening to Ian Russell and the understanding, which I had not come across previously, of the sheer scale of the kind of material that has been the subject of this debate on suicide and self-harm encouragement. We need to find an effective way of dealing with it and I entirely take my noble friend’s point that this needs a combination of protectiveness and support. I think the combination of these amendments is designed to do precisely that and to learn from experience through having the advisory committee as well.

It is clear that, by itself, user empowerment is just not going to be enough in all of this. I think that is the bottom line for all of us. We need to go much further, and we owe a debt to the noble Baroness, Lady Finlay, for raising these issues and to the Samaritans for campaigning on this subject. I am just sorry that my noble friend Lady Tyler cannot be here because she is a signatory to a number of the amendments and feels very strongly about these issues as well.

I do not think I need to unpack a great deal of the points that have been made. We know that suicide is a leading cause of death in males under 50 and females under 35 in the UK. We know that so many of the deaths are internet-related and we need to find effective methods of dealing with this. These are meant to be practical steps.

I take the point of the noble Baroness, Lady Fox, not only that it is a social problem of some magnitude but that the question of definitions is important. I thought she strayed well beyond where the definition of “self-harm” actually lies, but one could discuss that. I thought that the point of the noble Baroness, Lady Kidron, that we want good law rather than reliance on good people, was about definitions. We cannot just leave it to the discretion of an individual, however good they may be, moderating on a social media platform.

Along with the Samaritans, I very much regret that we no longer have the legal but harmful category, which would help guide us in this area. I share its view that we need to protect people of all ages from all extremely dangerous suicide and self-harm content on large and small platforms. I think this is a way of doing it. The type of content could perhaps be more tightly defined in terms of the kinds of information, depictions, instructions, portrayals and graphic descriptions that occur. One might be able to do more in that direction. I very much hope that we can move further today with some assurance from the Minister in this area.

The establishment of a specific unit within Ofcom, which was the subject of the Private Member’s Bill of the noble Baroness, Lady Finlay, is potentially a very useful addition to the Bill. I very much hope that the Minister takes that on board as well.

This has been a very good debate indeed. I have good days and bad days in Committee. Good days are when I feel that the Bill is going to make a difference and things are going to improve and the sun will shine. Bad days are a bit like today, where we have had a couple of groups, and this is one of them, where I am a bit worried about where we are and whether we have enough—I was going to use that terrible word “ammunition” but I do not mean that—of the powers that are necessary in the right place and with the right focus to get us through some of the very difficult questions that come in. I know that bad cases make bad law, but they can also illustrate why the law is not good enough. As the noble Baroness, Lady Kidron, was saying, this is possibly one of the areas we are in.

The speeches in the debate have made the case well and I do not need to go back over it. We have got ourselves into a situation where we want to reduce the harm that we see around us but do not want to impact freedom of expression. Both of those are so important and we have to hold on to them, but we find ourselves struggling. What do we do about that? We think through what we will end up with once this Bill is on the statute book, along with the codes of practice under it. This looks as though it is heading towards the question of whether the terms of service that will be in place will be sufficient and able to restrict the harms we will see affecting people who should not be affected by them. But I recognise that the freedom of expression arguments have won the day and we have to live with that.

The noble Baroness, Lady Kidron, mentioned the riskiness of the smaller sites—categories 2A and 2B and the ones that are not even going to be categorised as high as that. Why are we leaving those to cause the damage that they are? There is something not working here in the structure of the Bill and I hope the Minister will be able to provide some information on that when he comes to speak.

Obviously, if we could find a way of expressing the issues raised by the measures in these amendments as being illegal in the real world, they would be illegal online as well. That would at least be a solution that we could rely on. Whether it could be policed and enforced is another matter, but it certainly would be there. But we are probably not going to get there, are we? I am not looking at the Minister with any hope, and he has a slight downward turn to his lips. I am not sure about this.

How can we approach a legal but harmful issue with the sort of sensitivity that does not make us feel that we have reduced people’s ability to cope with these issues and to engage with them in an adult way? I do not have an answer to that.

Is this another amplification issue or is it deeper and worse than that? Is this just the internet because of its ability to focus on things to keep people engaged, to make people stay online when they should not, to make them reach out and receive material that they ought not to get in a properly regulated world? Is it something that we can deal with because we have a sense of what is moral and appropriate and want to act because society wants us to do it? I do not have a solution to that, and I am interested to hear what the Minister will say, but I think it is something we will need to come back to.

My Lords, like everyone who spoke, I and the Government recognise the tragic consequences of suicide and self-harm, and how so many lives and families have been devastated by it. I am grateful to the noble Baroness and all noble Lords, as well as the bereaved families who campaigned so bravely and for so long to spare others that heartache and to create a safer online environment for everyone. I am grateful to the noble Baroness, Lady Finlay of Llandaff, who raised these issues in her Private Member’s Bill, on which we had exchanges. My noble friend Lady Morgan is right to raise the case of Frankie Thomas and her parents, and to call that to mind as we debate these issues.

Amendments 96 and 296, tabled by the noble Baroness, Lady Finlay, would, in effect, reintroduce the former adult safety duties whereby category 1 companies were required to assess the risk of harm associated with legal content accessed by adults, and to set and enforce terms of service in relation to it. As noble Lords will know, those duties were removed in another place after extensive consideration. Those provisions risked creating incentives for the excessive removal of legal content, which would unduly interfere with adults’ free expression.

However, the new transparency, accountability and freedom of expression duties in Part 4, combined with the illegal and child safety duties in Part 3, will provide a robust approach that will hold companies to account for the way they deal with this content. Under the Part 4 duties, category 1 services will need to have appropriate systems and processes in place to deal with content or activity that is banned or restricted by their terms of service.

Many platforms—such as Twitter, Facebook and TikTok, which the noble Baroness raised—say in their terms of service that they restrict suicide and self-harm content, but they do not always enforce these policies effectively. The Bill will require category 1 companies—the largest platforms—fully to enforce their terms of service for this content, which will be a significant improvement for users’ safety. Where companies allow this content, the user-empowerment duties will give adults tools to limit their exposure to it, if they wish to do so.

The noble Baroness is right to raise the issue of algorithms. As the noble Lord, Lord Stevenson, said, amplification lies at the heart of many cases. The Bill will require providers specifically to consider as part of their risk assessments how algorithms could affect children’s and adults’ exposure to illegal content, and content that is harmful to children, on their services. Providers will need to take steps to mitigate and effectively manage any risks, and to consider the design of functionalities, algorithms and other features to meet the illegal content and child safety duties in the Bill.

Following our earlier discussion, we were going to have a response on super-complaints. I am curious to understand whether we had a pattern of complaints—such as those the noble Baroness, Lady Kidron, and others received—about a platform saying, under its terms of service, that it would remove suicide and self-harm content but failing to do so. Does the Minister think that is precisely the kind of thing that could be substantive material for an organisation to bring as a super-complaint to Ofcom?

My initial response is, yes, I think so, but it is the role of Ofcom to look at whether those terms of service are enforced and to act on behalf of internet users. The noble Lord is right to point to the complexity of some marginal cases with which companies have to deal, but the whole framework of the Bill is to make sure that terms of service are being enforced. If they are not, people can turn to Ofcom.

I am sorry to enter the fray again on complaints, but how will anyone know that they have failed in this way if there is no complaints system?

I refer to the meeting my noble friend Lord Camrose offered; we will be able to go through and unpick the issues raised in that group of amendments, rather than looping back to that debate now.

The Minister is going through the structure of the Bill and saying that what is in it is adequate to prevent the kinds of harms to vulnerable adults that we talked about during this debate. Essentially, it is a combination of adherence to terms of service and user-empowerment tools. Is he saying that those two aspects are adequate to prevent the kinds of harms we have talked about?

Yes, they are—with the addition of what I am coming to. In addition to the duty for companies to consider the role of algorithms, which I talked about, Ofcom will have a range of powers at its disposal to help it assess whether providers are fulfilling their duties, including the power to require information from providers about the operation of their algorithms. The regulator will be able to hold senior executives criminally liable if they fail to ensure that their company is providing Ofcom with the information it requests.

However, we must not restrict users’ right to see legal content and speech. These amendments would prescribe specific approaches for companies’ treatment of legal content accessed by adults, which would give the Government undue influence in choosing, on adult users’ behalf, what content they see—

I wanted to give the Minister time to get on to this. Can we now drill down a little on the terms of service issue? If the noble Baroness, Lady Kidron, is right, are we talking about terms of service having the sort of power the Government suggest in cases where they are category 1 and category 2A but not search? There will be a limit, but an awful lot of other bodies about which we are concerned will not fall into that situation.

Also, I thought we had established, much to our regret, that the terms of service were what they were, and that Ofcom’s powers—I paraphrase to make the point—were those of exposure and transparency, not setting minimum standards. But even if we are talking only about the very large and far-reaching companies, should there not be a power somewhere to engage with that, with a view to getting that redress, if the terms of service do not specify it?

The Bill will ensure that companies adhere to their terms of service. If they choose to allow content that is legal but harmful on their services and they tell people that beforehand—and adults are able and empowered to decide what they see online, with the protections of the triple shield—we think that that strikes the right balance. This is at the heart of the whole “legal but harmful” debate in another place, and it is clearly reflected throughout the approach in the Bill and in my responses to all of these groups of amendments. But there are duties to tackle illegal content and to make sure that people know the terms of service for the sites they choose to interact with. If they feel that they are not being adhered to—as they currently are not in relation to suicide and self-harm content on many of the services—users will have the recourse of the regulator to turn to.

I think that noble Lords are racing ahead a little bit in being pessimistic about the work of Ofcom, which will be proactive in its supervisory role. That is a big difference from the status quo, in terms of the protection for users. We want to strike the right balance to make sure that we are enforcing terms of service while protecting against the arbitrary removal of legal content, and the Bill provides companies with discretion about how to treat that sort of content, as accessed by their users. However, we agree that, by its nature, this type of content can be very damaging, particularly for vulnerable young people, which is why the Government remain committed to introducing a new criminal offence of content that encourages or promotes serious self-harm. The new offence will apply to all victims, children as well as adults, and will be debated once it is tabled; we will explore these details a bit more then. The new law will sit alongside companies’ requirements to tackle illegal suicide content, including material that encourages or assists suicide under the terms of the Suicide Act 1961.

The noble Baronesses, Lady Finlay and Lady Kidron, asked about smaller websites and fora. We are concerned about the widespread availability of content online which promotes or advertises methods of suicide and self-harm, and which can easily be accessed by people who are young or vulnerable. Where suicide and self-harm websites host user-generated content, they will be in scope of the Bill. Those sites will need proactively to prevent users being exposed to priority illegal content, including content that encourages or assists suicide, as set out in the 1961 Act.

The noble Baroness asked about the metaverse, which is in scope of the Bill as a user-to-user service. The approach of the Bill is to try to remain technology neutral.

I will plant a flag in reference to the new offences, which I know we will come back to again. It is always helpful to look at real-world examples. There is a lot of meme-based self-harm content. Two examples are the Tide Pods challenge—the eating of detergent capsules—and choking games, both of which have been very common and widespread. It would be helpful, ahead of our debate on the new offences, to understand whether they are below or above the threshold of serious self-harm and what the Government’s intention is. There are arguments both ways: obviously, criminalising children for being foolish carries certain consequences, but we also want to stop the spread of the content. So, when we come to that offence, it would be helpful if the Minister could use specific examples, such as the meme-based self-harm content, which is quite common.

I thank the noble Lord for the advance notice to think about that; it is helpful. It is difficult to talk in general terms about this issue, so, if I can, I will give examples that do, and do not, meet the threshold.

The Bill goes even further for children than it does for adults. In addition to the protections from illegal material, the Government have indicated, as I said, that we plan to designate content promoting suicide, self-harm or eating disorders as categories of primary priority content. That means that providers will need to put in place systems designed to prevent children of any age encountering this type of content. Providers will also need specifically to assess the risk of children encountering it. Platforms will no longer be able to recommend such material to children through harmful algorithms. If they do, Ofcom will hold them accountable and will take enforcement action if they break their promises.

It is right that the Bill takes a different approach for children than for adults, but it does not mean that the Bill does not recognise that young adults are at risk or that it does not have protections for them. My noble friend Lady Morgan was right to raise the issue of young adults once they turn 18. The triple shield of protection in the Bill will significantly improve the status quo by protecting adults, including young adults, from illegal suicide content and legal suicide or self-harm content that is prohibited in major platforms’ terms and conditions. Platforms also have strong commercial incentives, as we discussed in previous groups, to address harmful content that the majority of their users do not want to see, such as legal suicide, eating disorder or self-harm content. That is why they currently claim to prohibit it in their terms and conditions, and why we want to make sure that those terms and conditions are transparently and accountably enforced. So, while I sympathise with the intention from the noble Baroness, Lady Finlay, her amendments raise some wider concerns about mandating how providers should deal with legal material, which would interfere with the careful balance the Bill seeks to strike in ensuring that users are safer online without compromising their right to free expression.

The noble Baroness’s Amendment 240, alongside Amendment 225 in the name of the noble Lord, Lord Stevenson, would place new duties on Ofcom in relation to suicide and self-harm content. The Bill already has provisions to provide Ofcom with broad and effective information-gathering powers to understand how this content affects users and how providers are dealing with it. For example, under Clause 147, Ofcom can already publish reports about suicide and self-harm content, and Clauses 68 and 69 empower Ofcom to require the largest providers to publish annual transparency reports.

Ofcom may require those reports to include information on the systems and processes that providers use to deal with illegal suicide or self-harm content, with content that is harmful to children, or with content which providers’ own terms of service prohibit. Those measures sit alongside Ofcom’s extensive information-gathering powers. It will have the ability to access the information it needs to understand how companies are fulfilling their duties, particularly in taking action against this type of content. Furthermore, the Bill is designed to provide Ofcom with the flexibility it needs to respond to harms—including in the areas of suicide, self-harm and eating disorders—as they develop over time, in the way that the noble Baroness envisaged in her remarks about the metaverse and new emerging threats. So we are confident that these provisions will enable Ofcom to assess this type of content and ensure that platforms deal with it appropriately. I hope that this has provided sufficient reassurance to the noble Baroness for her not to move her amendment.

I asked a number of questions on specific scenarios. If the Minister cannot answer them straight away, perhaps he could write to me. They all rather called for “yes/no” answers.

The noble Baroness threw me off with her subsequent question. She was broadly right, but I will write to her after I refresh my memory about what she said when I look at the Official Report.

My Lords, I am extremely grateful to everyone who has contributed to this debate. It has been a very rich debate, full of information; my notes have become extensive during it.

There are a few things that I would like to know more about: for example, how self-harm, which has been mentioned by the Minister, is being defined, given the debate we have had about how to define self-harm. I thought of self-harm as something that does lasting and potentially life-threatening damage. There are an awful lot of things that people do to themselves that others might not like them doing but that do not fall into that category. However, the point about suicide and serious self-harm is that when you are dead, that is irreversible. You cannot talk about healing, because the person has now disposed of their life, one way or another.

I am really grateful to the noble Baroness, Lady Healy, for highlighting how complex suicide is. Of course, one of the dangers with all that is on the internet is that the impulsive person gets caught up rapidly, so what would have been a short thought becomes an overwhelming action leading to their death.

Having listened to the previous debate, I certainly do not understand how Ofcom can have the flexibility to really know what is happening and how the terms of service are being implemented without a complaints system. I echo the really important phrase from the noble Lord, Lord Stevenson of Balmacara: if it is illegal in the real world, why are we leaving it on the internet?

Many times during our debates, the noble Baroness, Lady Kidron, has pushed safety by design. In many other things, we have defaults. My amendments were not trying to provide censorship but simply trying to provide a default, a safety stop, to stop things escalating, because we know that they are escalating at the moment. The noble Lord, Lord Stevenson of Balmacara, asked whether it was an amplification or a reach issue. I add, “or is it both?”. From all the evidence we have before us, it appears to be.

I am very grateful to the noble Lord, Lord Clement-Jones, for pressing that we must learn from experience and that user empowerment to switch off simply does not go far enough: people who are searching for this and already have suicidal ideation will not switch it off because they have started searching. There is no way that could be viewed as a safety feature in the Bill, and it concerns me.

Although I will withdraw my amendment today, of course, I really feel that we will have to return to this on Report. I would very much appreciate the wisdom of other noble Lords who know far more about working on the internet and all the other aspects than I do. I am begging for assistance in trying to get the amendments right. If not, the catalogue of deaths will mount up. This is literally a once-in-a-lifetime opportunity. For the moment, I beg leave to withdraw.

Amendment 96 withdrawn.

Clause 36: Codes of practice about duties

Amendment 96A not moved.

Amendment 97

Moved by

97: Clause 36, page 36, line 42, at end insert “including a code of practice describing measures for the purpose of compliance with the relevant duties so far as relating to violence against women and girls.”

Member’s explanatory statement

This amendment would impose an express obligation on OFCOM to issue a code of practice on violence against women and girls rather than leaving it to OFCOM’s discretion. This would ensure that Part 3 providers recognise the many manifestations of online violence, including illegal content, that disproportionately affect women and girls.

My Lords, it is a great pleasure to move Amendment 97 and speak to Amendment 304, both standing in my name and supported by the noble Baroness, Lady Kidron, the right reverend Prelate the Bishop of Gloucester and the noble Lord, Lord Knight of Weymouth. I am very grateful for their support. I look forward to hearing the arguments by the noble Lord, Lord Stevenson, for Amendment 104 as well, which run in a similar vein.

These amendments are also supported by the Domestic Abuse Commissioner, the Revenge Porn Helpline, BT, EE and more than 100,000 UK citizens who have signed End Violence Against Women’s petition urging the Government to better protect women and girls in the Bill.

I am also very grateful to the noble Baroness, Lady Foster of Aghadrumsee—I know I pronounced that incorrectly—the very distinguished former Northern Ireland politician. She cannot be here to speak today in favour of the amendment but asked me to put on record her support for it.

I also offer my gratitude to the End Violence Against Women Coalition, Glitch, Refuge, Carnegie UK, NSPCC, 5Rights, Professor Clare McGlynn and Professor Lorna Woods. Between them all, they created the draft violence against women and girls code of practice many months ago, proving that a VAWG code of practice is not only necessary but absolutely deliverable.

Much has already been said on this, both here and outside the Chamber. In the time available, I will focus my case for these amendments on two very specific points. The first is why VAWG, violence against women and girls, should have a specific code of practice legislated for it, rather than other content we might debate. The second is what having a code of practice means in relation to the management of that content.

Ofcom has already published masses of research showing that abuse online is gendered. The Government’s own fact sheet, sent to us before these debates, said that women and girls experience disproportionate levels of abuse online. They experience a vast array of abuse online because of their gender, including cyberflashing, harassment, rape threats and stalking. As we have already heard and will continue to hear in these debates, some of those offences and abuse reach a criminal threshold and some do not. That is at the heart of this debate.

The first death threat that I received—I have received a number, sadly, both to me and to my family—did not talk about death or dying. It said that I was going to be “Jo Coxed”. Of course, I reported that to Twitter and the AI content moderator. Because it did not have those words in it, it was not deemed to be a threat. It was not until I could speak to a human being—in this case, the UK public affairs manager of Twitter, to whom I am very grateful—that it even started to be taken seriously.

The fear of being harassed is impacting women’s freedom of speech. The Fawcett Society has found that 73% of female MPs, versus 51% of male MPs, say that they avoid speaking online in certain discussions because of fear of the consequences of doing so. Other women in the public eye, such as the presenter Karen Carney, have also been driven offline due to gendered abuse.

Here is the thing I cannot reconcile with the government response on this so far. This Government have absolutely rightly recognised that violence against women and girls is a national threat. They have made it a part of the strategic policing requirement. If tackling online abuse against women and girls is a priority, as the Government say, and if, as in the stated manifesto commitment of 2019, they want the UK to be

“the safest place in the world to be online”,

why are the words “women and girls” not used once in the 262 pages of the current draft of the Bill?

The Minister has said that changes have been made in the other House on the Bill—I understand that—and that it is now focused more on the protection of children in relation to certain content, whereas adults are deemed to be able to choose more what they see and how they see it. But there is a G in VAWG, for girls. The code of practice that we are talking about would benefit that very group of people—young girls, who are children—whom the Government have said that they really want to protect through the Bill.

Online harassment does not affect only women in the public eye but all women. I suspect that we all now know the statistic that women are 27 times more likely to be harassed online than men. In other words, to have an online presence as a woman is to expect harassment. That is not to say that men do not face abuse online, but a lot of the online abuse is deliberately gendered and is targeted at women. Do men receive rape threats on the same vast scale as women and young girls?

It should not be the public’s job to force a platform to act on the harmful content that it is hosting, just as it should not be a woman’s job to limit her online presence to prevent harassment. But the sad reality is that, in its current form, the Bill is doing very little to force platforms to act holistically in relation to violence against women and girls and to change their culture online.

The new VAWG-relevant criminal offences listed in the Bill—I know that my noble friend the Minister will rely on these in his response to the debate—including cyberflashing and coercive and controlling behaviour, are an important step, but even these new offences have their own issues, which I suspect we will come on to debate in the next day of Committee: for example, cyberflashing being motive-based instead of consent-based. Requiring only those platforms caught by the Bill to look at the criminal offences individually ignores the rest of the spectrum of gendered abuse.

Likewise, the gender-neutral approach in the Bill will harm children. NSPCC research found that in 2021-22, four in five victims of online grooming offences were girls. The Internet Watch Foundation, an organisation we are going to talk about in the next group, has found in recently published statistics that girls are more likely to be seriously abused online. I have already stated that this is not to say that boys and men do not experience abuse online, but the fact is that women and girls are several times more likely to be abused. This is not an argument against free speech; people online should be allowed to debate and disagree with each other, but discussions can and should be had without the threat of rape or harassment.

Again, the Government will argue that the triple-shield approach to combating legal but harmful content online will sufficiently protect women and girls, but this is not the case. Instead of removing this content, the Bill’s user empowerment tools—much debated already—expect women to shield themselves from seeing it. All this does is allow misogynistic and often violent conversations to continue without women knowing about them, the result of which can be extremely dangerous. A victim of domestic abuse could indeed block the user threatening to kill them, but that does not stop that user from continuing to post the threats he is making, or even posting photos of the victim’s front door. Instead of protecting the victim, these tools potentially leave them even more vulnerable to real-life harms. Likewise, the triple shield will rely too heavily on platforms setting their own terms and conditions. We have just heard my noble friend the Minister using this argument in the last group, but the issue is that the platforms can choose to water down their terms and conditions, and Ofcom is then left without recourse.

I turn to what a violence against women and girls code of practice would mean. It could greatly reduce all the dangers I have just laid out. It would ensure that services regularly review their algorithms to stop misogyny going viral, and that moderators are taught, for example, how to recognise different forms of online violence against women and girls, including forms of tech abuse. Ofcom has described codes of practice as

“key documents, which set out the steps services can take to comply with their duties”.

Services can choose to take an alternative approach to complying with their duties, provided that it is consistent with the duties in the Bill, but codes will provide