
Social Media: Deaths of Children

Volume 817: debated on Thursday 20 January 2022

Question for Short Debate

Asked by

To ask Her Majesty’s Government what assessment they have made of the role played by social media in the deaths of children in the United Kingdom, including by suicide, self-harm and murder.

I declare my interests, particularly as chair of 5Rights and as a member of the Joint Committee on the Draft Online Safety Bill.

My Lords, many of you will have read reports of how, in 2017, 14-year-old Molly Russell took her own life after being bombarded by self-harm and pro-suicide images. In the days after her death, her father Ian tried to access her phone simply to try to understand what had happened to his daughter. The notes from his diary from that time make for grim reading. The woman at the so-called Genius Bar in the Apple Store “could not help”. The promised follow-up call failed to materialise—despite Ian sitting grief-struck, pen in hand, waiting at the appointed hour. Even after he finally found a person authorised to deal with him, they were only allowed to send a template information request form by email, which required a great deal of information from Ian but did not result in him receiving the information he requested. Apple has never helped Ian to access Molly’s phone, and without the assistance—indeed, the persistence—of the coroner and the police, the data it contained would not be available to Molly’s inquest, which is still investigating the contributory causes of Molly’s death four years later.

Judy and Andy Thomas struggled similarly after the suicide of their 15-year-old daughter Frankie, unable to get anything more than an automated response. Their letters to Instagram’s CEO Adam Mosseri, copied to the European headquarters, went unanswered. It was only after a year of desperate letter-writing to anyone who might help that I was able to arrange a call on their behalf, only for them to hear that they were not going to get the information they wanted. During Frankie’s inquest, despite evidence that her suicide was highly influenced by what she had seen online, Wattpad refused to disclose full details of Frankie’s activity on its platform, even while confirming that self-harm and suicide stories on its site should be rated mature and should not have been accessible to a user registered as a child.

Olly Stephens, who was 13 when he was murdered, had repeated problems online. He was groomed by a wannabe county lines gang, extorted by a group who stole his bike and, finally, lured to a park where he was killed, the murder having been organised online. His father Stuart says that in the hours immediately after his murder, Olly’s mother and sister had to trawl through social media sites to get evidence because they were aware that they would never get it from the tech companies.

When a child dies, parents are asked to clear out the school locker: they inherit the artefacts of a child’s life. If the authorities have access to information that may shed light on the circumstances of their death, it is shared as a matter of course—but not if that information is online. The argument made by the tech sector is that it is protecting other users, but that does not account for parents’ need for closure or for the evidence necessary for police and coroners, and it conveniently obscures the role of the tech companies themselves as they continue to recommend harmful material to, and facilitate violent abuse of, other children.

In the other place two days ago, Ian Paisley MP introduced a 10-minute rule Bill to grant next of kin the right to access the smartphone and other digital devices of a person upon their death or incapacity. He made the important point that much precious material, both sentimental and relevant to understanding what happened, is withheld from the next of kin simply because people—particularly the young—do not think to leave a password in their will. Indeed, it is unlikely that any child would even have a will. He also pointed out that access was eminently possible: in the US some states have brought in legislation, such as the Revised Uniform Fiduciary Access to Digital Assets Act, to retrieve financial assets. Once again, money trumps child safety.

The Joint Committee made two recommendations on this issue: that the Government should consult on how terms and conditions of online services can be reformed, by law, to give bereaved parents access to data; and that Ofcom, the ICO and the Chief Coroner should review the powers of coroners to ensure that they have unfettered access to digital data, including data recommended to children by tech companies, following the death of a child—and that both of those should happen before the Bill reaches Royal Assent.

I ask the Minister to put on record today that the draft Bill will be amended so that other families do not suffer as the Russell, Thomas and Stephens families have done. We cannot bring their children back, but we can create a lasting legacy for their extraordinary courage in speaking out.

The purpose of today’s debate is not only to secure justice for bereaved families, but to highlight steps that should be taken to prevent tragedy. Sitting on the Secretary of State’s desk is a comprehensive set of recommendations from the Joint Committee that would fundamentally change how the sector treats children. They are: mandatory safety by design to scale back harmful algorithms, design features and business practices; a binding child safety code that sets out risks and mitigations in accordance with the Convention on the Rights of the Child; alignment with the age-appropriate design code to make sure the Bill applies to all services likely to be accessed by children, so that there is nowhere to hide; mandatory cross-platform co-operation, so that risks known by one service are routinely shared with others; statutory codes for moderation and complaints, to ensure that swift action is taken before tragedy strikes; and a regulatory focus on risk rather than size. Again and again we see that small is not safe. I refer back to the content Frankie saw on Wattpad, a service that many of you will never have heard of.

There should also, of course, be the immediate introduction of age assurance, without which we will fail to deliver any of the protections that I have set out. This list is neither aspirational nor nice to have: these are essential and interdependent elements of a proportionate and enforceable regime to make our children safe. All other business sectors apply rules of product safety, and it is tragic that it has taken the deaths of children to give urgency to our calls for regulation.

TikTok, Meta, Apple and Alphabet are among the most valuable and profitable companies in the world, and the tech sector alone is now responsible for 25% of global GDP. But these same companies are algorithmically promoting and spreading material that nudges children into states of despair; priming kids into gambling habits with reward features that induce dopamine hits, which cause addiction; granting unfettered access to age-restricted spaces; fuelling an epidemic of eating disorders, self-harm and radicalisation; and systematically hiding the evidence. Even in a world focused on the balance sheet of loss and profit, children’s lives should not be the collateral damage of the tech sector. It is time to bring that to a halt—and halt it we can.

The Joint Committee recommendations have unprecedented support across the political spectrum, as they do across civil society. All that is required is for the Government to act. I ask the Minister, when he answers, to acknowledge that failure to have these things in place is costing children their lives—and I ask for a commitment to all the Joint Committee’s recommendations that relate to children. This is a time not for cherry-picking headline-grabbing changes, but rather for setting out an enforceable product safety regime that will keep our children safe.

Given the tech companies’ determined efforts to frustrate basic child safety requirements, I ask the Minister again to explain to the Committee how the Government can justify delaying the introduction of age assurance. They have failed to implement Part 3 of the Digital Economy Act and rejected my Private Member’s Bill for privacy-preserving age assurance, instead putting their faith in a voluntary scheme which their own officials estimate would take a minimum of two years and do nothing to affect those who do not volunteer. This implicitly goes against statements made last week in the other place by the Minister for Digital that self-regulation has failed. If the Government acted today, Ofcom could set out expectations of age assurance by the end of the year, unleashing an arms race of innovation to meet those expectations. Failing to act means that more families will suffer heartbreak and more children harm.

In spite of my many years on this beat, Olly’s father Stuart shocked me to the core when he said that, since Olly’s death, he has received over 300 taunting and abusive messages via social media—images of people waving knives, celebrating Olly’s death and threatening his wife and daughter with rape, along with pictures identifying where they live. This sector does not have the authority or willingness to police itself. My deepest thanks go to those noble Lords who have chosen to speak; given our sad subject matter, I anticipate their words with trepidation.

My Lords, I will speak to one particular issue that the noble Baroness has raised, quite rightly in my opinion, in this debate and in the report of the Draft Online Safety Bill Joint Committee, of which I know she was a very active member. This is the question of access to data from the accounts of people who have sadly taken their own lives where there is a view that it may reveal something useful and important for their grieving relatives.

I do this as somebody who used to work for a social media platform and took part in the decision-making process on responding to requests for data in these tragic circumstances. In the internal debate, we had to weigh two potential harms against each other. It was obvious that refusing to disclose data would add to the pain and distress of grieving families, which the noble Baroness eloquently described for us, and, importantly, reduce opportunities for lessons to be learned from these awful situations. But there was also a fear that disclosing data might lead to other harms if it included sensitive information related to the connections of the person who had passed away.

The reluctance to disclose is sometimes described as being for “privacy reasons”. We should be more explicit; the concern in these cases is that, in trying to address one tragedy, we take an action that leads to further tragedy. The nightmare scenario for those discussing these issues within the companies is that another young person becomes so distressed by something that has been disclosed that they go on to harm themselves in turn. This genuine fear means that platforms will likely err on the side of non-disclosure as long as providing data is discretionary for them. If we want to solve this problem, we need to move to a system where disclosure is mandated in some form of legal order. I will briefly describe how this might work.

Families should not have to go directly to companies at a time of serious distress; they should instead be able to turn to a specialist unit within our court system which can assess their request and send disclosure orders to relevant companies. The noble Baroness eloquently described the problem we have with the status quo, where people approach companies directly. The platforms would then be required to provide data to the courts, which would need to be able to carry out two functions before making it available to families and coroners as appropriate.

First, they should be able to go through the data to identify whether there are particular sensitivities that might require them to withhold or effectively anonymise any of the content. To the extent possible, they should notify affected people and seek consent to the disclosure. In many cases, the platforms will have contact details for those individuals. Secondly, they must be able to consider any conflicts of law that might arise from disclosure, especially concerning content related to individuals who may be protected by laws outside the jurisdiction of the UK courts. This would need to include making decisions on content where consent has been withheld. If we could set up a structure such as this, we could have a regime that would work for all interested parties.

A few minutes is obviously not long enough to cover all these issues in detail, so I will publish a more comprehensive post on my blog, which is aptly named regulate.tech. I thank the noble Baroness for creating an opportunity to consider this important issue, one I am sure we will return to during the passage of the online safety Bill.

My Lords, I was looking for something original to say in this debate, so I went back to my previous existence as a Member of the European Parliament. One thing I still greatly regret is that, in leaving Europe, we have left behind the structures around it that can be helpful when we face problems like this. In particular, I think of the work of the EU directorate for health. In Europe, most countries face problems similar to ours and are trying to solve them. In Spain, suicide is the second-highest cause of death among young people—defined by Spain as those aged 15 to 29, which probably goes a bit further than we would. Spain has put €100 million into a strategy to combat it, but it is doubtful whether it will do anything because, as mentioned by the noble Lord, Lord Allan, the key is getting access to the information. Italy is setting up an observatory, although it seems to be taking a long time. Even in Finland, which one thinks of as a very enlightened Nordic country that deals with such issues, something like 25% of all suicides are in the age group from 15 to 24, so it is a problem that that country is also grappling with.

This is one of the great tragedies of leaving the EU. Although the EU has no formal responsibility, everyone will tell you that there are unofficial meetings of Health Ministers, where anything can be put on the agenda by any member state, so it is possible to exchange information. Have the Government gone to any effort to get information from other countries on how they are dealing with the issue, what their plans are, and whether they will publish that?

I have a couple of points from the briefing that I got. Among other things, it says that in a debate in the House of Commons, Chris Philp

“argued that they could ‘edit their algorithms tomorrow […] they should not be waiting for us to legislate; they should do the right thing today’.”

Is there any sign of that right thing being done today? If so, it is certainly not recognised here. The briefing also said that

“Instagram said that it would ban graphic images of self-harm as part of a series of changes.”

Has it? Also, the online harms consultation says that the framework should include provisions to address suicide and self-harm. Has that been done?

Finally, can the Minister confirm in relation to suicide that all platforms and people of all ages will be in the scope of the final Bill when it is presented to the House? That is an important point. We need to go beyond just this group to the wider problem.

My Lords, it is a pleasure to follow the noble Lord, Lord Balfe, and I thoroughly agree with him that we have to go beyond this specific issue to the wider problem. I congratulate my noble friend Lady Kidron on keeping up the pressure on this incredibly important debate. I want to briefly mention two different aspects: one is about young girls and one is about young boys.

I have talked before about the sexual pressures on girls that happen online. I remember so well the anxieties of being a teenager, of trying to set up Spare Rib magazine and feeling immensely conflicted about trying to own your own sexuality and your own rights in the world, to have dignity and control, and to be able to say yes and indeed to be able to say no. Looking back, if I had been able to see the kind of pornography that is now available at a simple click, that would have been extremely hard. You are presented with streams and streams of apparently willing young women who agree to have sex without so much as a dinner and a nice night out; what they enjoy is a semi-situation of rape, over and over. The women are almost always extremely thin, shaved, hairless, kind of perfect—almost doll-like. They are completely and utterly unreal and bear very little resemblance to what an average teenage girl is. While my noble friend Lady Kidron has spoken so movingly about girls who take their own lives, there are a lot of stages on the road to that which are about misery, dejection, unhappiness and shaming—a consciousness all the time that “I am not good enough.” Indeed, the entire advertising world—you see this hugely online—is predicated on the fact that you could be better. There is no such thing left as normal hair or a normal size. In every case, if you spend money, you will be better.

In my remaining minute and a half I will talk about what happens to young men. In particular, I want to talk about my friend Laura Bates, who wrote Everyday Sexism. She used a fake account to set herself up online as a boy. She said, “I am 15 and I’m having a tough time getting dates” and said that she had acne. To start with, there was advice about acne drugs, then a bit of advice about how to dress. Then the advice started to get a bit creepier: “Are the girls in your school being too uppity? Are they beating you in class?” Quite soon, in the course of a few weeks, Laura found herself on an incel website. The progression was just click by click. It preyed on every sort of suspicion that a young man might have that somehow women are doing better and that somehow their lot in life is not their responsibility but the result of what feminists have done.

It is not a question of arguing this or that; the point is that when you see how the end product of misogyny and incels is now “Kill the women”, that is incredibly dangerous. Right across the internet, young people are being drawn into ever more extreme points of view that bear very little relationship to their reality. It happens across sexuality and with young men. At all points the internet companies could stop this but in many instances they are just making money out of preying on people’s weakness and lack of self-respect. These are all really difficult things to talk about, and if, as has been happening in the pandemic, your online world becomes even more real than your real world, you have very little way to express those feelings and get some help.

My Lords, I too thank the noble Baroness, Lady Kidron, for securing this most important debate on the contributory role of social media to the deaths of children, and I pay tribute to her persistent campaigning on this subject. It is a timely debate given that only a month ago we received the legislative scrutiny committee’s report on the draft online harms Bill.

I want to focus on the whole question of the extent to which we understand the numbers and the causes of child deaths, not only where social media plays a significant role but in a whole range of other issues. This is a much broader problem than just this topic, although it is a superb example of why we need better research and better recording of data.

In December, your Lordships’ House debated the Second Reading of my Coroners (Determination of Suicide) Bill. It would require coroners to record any relevant contributory factors once a death by suicide has been officially determined. It would not be a finding in law, the results would be anonymised before publication, and it would be akin to the well-established processes that hospitals have for recording comorbidities of death.

All sorts of groups are campaigning and looking for much better data. Your Lordships will know that the reason I have brought forward the Coroners (Determination of Suicide) Bill is that we have been trying to get accurate stats on gambling-related suicides—many of them are of younger adults—which, according to recent evidence from Public Health England, account for roughly 8% of all suicides. That is a really significant number of suicides. Regardless of the criticisms of my Bill on feasibility, there is an important principle here about how we record comorbidities and use that evidence. Again and again when campaigning against massively powerful industries, one argument is that we do not really have the statistics. I have to say that Her Majesty’s Government officially come back with the same argument again and again, so for the last five years, my question has been, “Please will you help us to start getting accurate stats?” That is why I turned up with the idea of a coroners Bill. It is absolutely crucial to get the accurate stats because, if we do not, we will never be able to devise strategies to reduce the number of suicides. You do not reduce suicides in general by saying nice and comforting things about it to people; you find out what the causes are and get a strategy to address each one. Particularly when we have something that causes 8% of deaths, we really need to collect that sort of evidence.

Of course the Government are legislating to prevent child exposure to some of the content that Molly Russell and others saw, but it is absolutely crucial that we get ways of trying to understand properly what is going on. The Government say that my Bill will not be an appropriate mechanism to collect the evidence. It has led us to discussions with a number of coroners about postvention studies, which may be how we can get hold of that data. However it is obtained, we need it. Will the Minister tell the Committee specifically what Her Majesty’s Government are doing to try to get this data, rather than keep saying, “Oh dear, we haven’t got it”? It is vital that it is collected, if we are to have an evidence-based approach to preventing suicides in relation to all associated risk factors.

My Lords, I too thank the noble Baroness, Lady Kidron, for securing this short debate, and pay tribute to the work she has done in this area. Her perseverance and tenacity are admirable.

There are many benefits of social media; it is a source of learning, advice and support for children and young people. But as we know there are many negatives too, as illustrated by the case of Molly Russell and other cases highlighted by the noble Baroness. There is also concern that harm caused by these platforms is exacerbated by the systems and processes of companies which amplify the spread of this dangerous content. Access is made easier by recommendations and algorithmically generated content suggestions. There is another concern that these companies then thwart the efforts of grieving parents to retrieve the data and information relating to their child’s online activity.

While quantifying the link between social media and children’s health is complicated, and it is argued that there is no definitive academic research, I agree with the House of Lords Science and Technology Select Committee that the absence of good academic evidence is not evidence that social media and screens have no effect on young people and children—they do. The Joint Committee on the Draft Online Safety Bill recently reported the evidence it has received linking self-harm and suicide attempts with accessing content online. There is therefore a strong case for taking action now, before the situation gets worse, and acting on the recommendations of the Joint Committee. It would be very short-sighted to lose the opportunity of including these in the online safety Bill.

I urge the Government to accept the Joint Committee’s recommendations to protect children as a comprehensive package. A statutory code on child online safety must be introduced, and the Bill should be extended to ensure that children are offered safety measures on all services likely to be accessed by children. Access to data by grieving parents in case of death should be included in the Bill.

Will the Government adopt in full the recommendations of the Joint Committee report and, if not, can the Minister please explain the reasons why not? Will the Government accept the proposal by the Law Commission to criminalise the encouragement of self-harm, threats of serious harm and stirring up hatred on grounds of sex, gender and disability? Can the Minister also confirm, in relation to suicide and self-harm content, that all platforms and people of all ages will be in scope of the final Bill presented to Parliament, and that age assurance will be part of the Bill? I look forward to the Minister’s response.

My Lords, I congratulate the noble Baroness, Lady Kidron, on securing this important debate. We are kindred spirits. Like the noble Baroness, I have long campaigned to try to make the internet safer for children, and I declare an interest as a vice-president of Barnardo’s, which is also deeply concerned about these issues.

Protecting children online must be an urgent priority for this House. The online safety Bill provides hope that the internet will be made safer for everyone, particularly for children and young people, but it will not come into force until 2024, so how are we going to protect children today, especially from violent, illegal sexual pornography? We need interim measures now, because even the DCMS research has shown that too many of our children have already been exposed to harmful content online, including violent pornography. Children themselves believe that they should be protected from harmful content online and on social media, saying that social media content often made them feel negatively about themselves, humiliated, threatened and embarrassed and, in some cases, was a cause of self-harm or even suicide. Research by Facebook found that 13% of teenagers in the UK said feelings of wanting to kill themselves had started on Instagram.

The online safety Bill is a once-in-a-generation chance to help to address these problems and finally make the internet safer for children, but we also need interim measures in place now to protect children online. If we wait for the online safety Bill, a whole generation of children and young people will have been left unprotected. The legislation has already been passed in Part 3 of the Digital Economy Act 2017, which provides some level of protection for children. Why not implement it? We have already wasted almost three years by not doing so. Think of the harm that has been done in that time.

Along with Barnardo’s and many other children’s charities, I have been calling for the Bill to include age verification for all dangerous pornography sites, but it needs to go further to make sure that it defines harmful but legal content that depicts self-harm and glorifies suicide, so that this, too, can be put on the virtual top shelf, behind a wall and out of sight of vulnerable children. It is also essential that children can access the support they need when they have been abused, bullied and exploited online, which can lead to suicidal thoughts. They desperately need mental health support teams in their schools, too. Social media platforms need to invest in awareness-raising, as well as signposting children who may be at risk of bullying or abuse to the support they need to recover and stay safe.

What research have the Government undertaken to understand the true impact of social media on vulnerable children’s mental health? Will they agree to meet prior to the publication of the online safety Bill with Barnardo’s and other children’s charities to discuss how to ensure that it protects children from preventable harm? Why will they not implement Part 3 of the Digital Economy Act as an interim measure to show that they truly care about children’s well-being?

My Lords, most deaths are sad and some are tragic, but a death from suicide is particularly devastating. It leaves the survivors with a question that remains on their minds for the rest of their lives: what went wrong, what more could I have done? The death of a young person from suicide is especially gut wrenching. How can anyone, let alone someone so young, find existence so unbearable that they choose to reject the precious gift of life? Self-harm, which sometimes leads to death, belongs in the same category.

I am very glad that the noble Baroness, Lady Kidron, has secured this debate before the online safety Bill comes before the House and has pursued this issue so tenaciously for so long. I support what she said about the need for parents of a deceased child to have access to their child’s digital data, which was reiterated in the 10-minute rule Bill by Ian Paisley. If it is possible to add further pain to parents whose child has committed suicide, it is by them not really knowing, or not knowing fully enough, why their child took their own life. This happens if, for example, the child leaves no farewell note. Parents whose child has committed suicide, partly as a result of what that child has seen and heard on social media, may have some idea of what has happened, but they will want to understand as fully as possible and to have access to their child’s data.

I will not repeat what the noble Baroness and other people have said about the terrible difficulty at the moment of parents getting access to the data and the need to do something about it. As I say, this adds to the distress of the parents, who want to know more and try to understand. As has been mentioned, the failure to allow access means that we cannot learn from it to ensure that the same or similar material is not recommended to other children.

Further than this, I simply add my support to the other measures proposed by the noble Baroness. We know how serious the problem is. We have been told that there is no exact academic data, but the surveys we have indicate that there is a link in about 25% of suicides and cases of self-inflicted harm among young people. It is very difficult to doubt that link if one has seen some of that material.

As the noble Baronesses, Lady Boycott and Lady Benjamin, reiterated from their experience of this, it needs to be seen as part of a wider problem. As they will know better than I do, there is a particular fragility among young people at the moment, partly because of Covid and partly because of the intense pressure of social media.

A voluntary code is not enough, of course, so with other noble Lords I look to the Minister to support the recommendations of the Joint Committee. We need clear, firm, enforceable legislation, which is essential to prevent the circulation of harmful content and to ensure that young people simply cannot have access to it.

My Lords, this is one of those debates in which one has a developing feeling of sympathy with the Minister who is going to reply. Unfortunately, that feeling does not always survive the Minister’s speech. We will see what happens today.

It is quite clear that the internet and the online world are something that we have not really wanted to address fully in the past. The whole system has been a little stand-offish, finding it a little frightening and fast-moving. Most of us are probably happier if somebody younger than us, often our children, explains how the damned thing works.

Having said that, it is quite clear now that, in the greatest traditions of legislation, there are problems that we must deal with. One of those problems is that people feel that they are in a space where they cannot be touched, where they can do what they like and indulge themselves, and where money-making activities and the like are beyond the reach of the rest of society, which is assumed to have no business getting in there.

This was brought home to me by a family friend, and this relates directly to the last point made by the noble Baroness, Lady Kidron, in her opening remarks. I live in the village of Lambourn, in the “valley of the racehorse”, and one of my family friends down there is a trainer whose daughter is an up-and-coming jockey. He discovered that after a couple of bad rides his daughter was being threatened with being raped and killed. He was quite appalled when he discovered that this is regarded as fairly normal. The real problem started when he went to the police, who said, “Oh well, we’re not really going to do anything about it”.

We must have a structure where we can intervene on this, as we would on other occasions. That is one of the things that must come out of this. There are technical points—on algorithms, how you follow things through, and how the people running these sites generate interest, money or indulgence—but the fact of the matter is that we must have a very clear guide as to how we will get in and make sure that there is a price if people break any rules, laws or structures. This could be one of the restrictions. It would not stop everything, but it would give us a route to go down.

We need knowledge from sites that suicide victims have used. We need to be able to get in and use this control, which we regard as normal everywhere else—we have laws, and if you break them, we enforce them. The police, of course, will need help with new structures. If, when the Minister replies, he could give us an idea of what this enforcement structure will be like and the Government’s thinking about it, he would have taken the first step. Not only do we need to have these structures, but people have to know about them and the fact that society has decided that we are not going to put up with this anymore. That is one of the key points here.

I will not try to follow my noble friend’s experience and technical detail on this, because I cannot. But the principle behind this must be that the Government take this seriously, act and state publicly that this happens. If they do not, we will carry on having this problem again and again, and people will not react and do something about it because it is easier just to hope that it goes away.

My Lords, we do indeed owe the noble Baroness, Lady Kidron, a big debt of thanks today for bringing forward this important debate, and particularly for the way in which she addressed the issues, giving us the stories and case histories of those tragedies. Too often, this is a very dry subject: we look at the numbers, for example in the Lords briefing on this, and do not think of the human stories and the impact beyond them. That is really important for us to focus on as legislators.

In part, the online harms agenda has come about because of social media-influenced behaviours, with suicide being principal among those. But despite much fanfare about the DCMS policy work, little has yet been delivered by the Government to make the internet a safer place for children and young people.

The briefing for this debate, which I referred to, sets out very worrying trends in relation to suicide and self-harm by young people. Although correlation does not always imply causation, the evidence gathered by a variety of charities and campaign groups strongly points to a negative role played by social media. Our personal experiences reinforce that, too. My kids grew up as the internet was expanding, and they often relayed to me horror stories of experiences that they and their friends had had and the impact those had upon them.

The implementation of the age-appropriate design code was a welcome step forward, but we have a long way to go on this, and I hope we can hear more from the Minister on that. As we know, significant gaps in the regulatory landscape remain, and it is not clear whether the Government will adopt all the recommendations made by the Joint Committee on the Draft Online Safety Bill, so perhaps the Minister could enlighten us a bit more on that. I know he cannot give us a full picture and pre-empt a future Queen’s Speech, but some hints would be very helpful.

Even if the Government choose to close loopholes and go further on statutory requirements, your Lordships’ House is unlikely to consider the Bill until late 2022, or possibly even 2023. As the noble Baroness, Lady Benjamin, said, the entry into force of key measures is unlikely before 2024. That delay is unacceptable, particularly at a time when young people are under great pressures—the impact of Covid, and the sense of delay to their lives and their personal development that the past two years have brought. We urgently need to bring forward more measures. I was encouraged to see that, in the Police, Crime, Sentencing and Courts Bill, the Government have at least adopted a position on online racist harm to footballers. That was a sign of good intent, but they need to build on it.

The noble Baroness, Lady Kidron, has been tireless on this issue, and has raised it many times. We should offer as much support as we can to that campaign. Perhaps the Minister will outline some of the Government’s commitments today. When can we expect some concrete actions—legislative or non-legislative—in the current parliamentary Session? That would be a good way forward.

Debates such as this are important, but what we really need is change. We as a party stand ready to engage with colleagues from all sides, and with the platforms themselves, to make the digital world a safer place for children and young people to be, but we need more than warm words from Ministers.

My Lords, I start by thanking the noble Baroness, Lady Kidron, for tabling this important debate, and for beginning by setting out the personal and often harrowing examples that should be uppermost in our minds as we debate these important issues. I am grateful, too, for her drawing to my attention the 10-minute rule Bill introduced in another place on Tuesday by Ian Paisley MP. I have read the transcript of his remarks, in addition to listening to the contributions of noble Lords today.

Her Majesty’s Government share the concerns raised by noble Lords today about the risks to children of harmful content and activity online, including in social media. Although many children have a positive experience online—it is important to remember that—it is clear that the presence of harmful material, and in particular content promoting suicide or self-harm, can have a serious impact on children’s mental health and well-being. The noble Baroness, Lady Boycott, was right to point to the fragility and vulnerability of young people, including adolescents and people well into their teens.

Sadly, we know from research, from coroners’ reports and from colleagues in the police that harmful online content, including that seen in social media, is playing an increasing role in individual suicides. In addition, figures from 2020 show that 40% of 12 to 15 year-olds who are concerned about and have experienced content promoting self-harm cite social media as the source. There is also evidence that gangs are using social media to promote gang culture, to taunt each other, and to incite violence.

The Government are determined to hold social media and other platforms to account for this harmful content, and to make the UK the safest place to be a child online. A key part of that is the online safety Bill, which, as noble Lords know, we published in draft last May. For the first time, under that Bill platforms will have a clear legal responsibility for keeping their users safe online. Platforms will have to understand the risk of harm to their users, and put systems and processes in place that improve their users’ safety.

All companies within the scope of the Bill will have to remove and limit the spread of illegal content, such as material that encourages or assists suicide, and take steps to prevent similar material from appearing. The largest tech companies will also be held to account for tackling activity and content harmful to adults who use their service. Under the new laws, those companies will have to set out clearly what content is acceptable on their platforms, and enforce their terms and conditions consistently. That will enable us to address many of the questions raised by my noble friend Lord Balfe, and to hold companies to account.

The strongest protections in the legislation will be for children. Services likely to be accessed by children will need to conduct a child safety risk assessment and provide safety measures for their child users against harmful and age-inappropriate content. Platforms likely to be accessed by children must consider the risks that children on their services face from priority harmful content—that will be set out in secondary legislation—and any other content they may identify that could cause harm to children. They will also need to consider the risk of harm from the design and operation of their systems.

We expect priority harms for children to include age-inappropriate material, such as pornography and extreme violence, and harmful material such as that which promotes eating disorders and self-harm, as well as cyberbullying. Ahead of designating the “priority harms”, which will be in scope of the legislation, the Government have commissioned research to build the evidence base on harms to children online. This research will review the prevalence and impact of a wide range of harmful content to ensure that the legislation adequately protects children from content that is harmful to them. Ofcom will have a duty to advise the Government on priority categories of harm to children and will also want to draw on evidence and views from relevant parties. That includes Barnardo’s, as raised by the noble Baroness, Lady Benjamin. I am pleased to say that my honourable friend the Minister for Tech and the Digital Economy has already met Barnardo’s in that regard.

The regulator, Ofcom, will set out the steps that companies can take to comply with their duties through statutory codes of practice. Platforms will then be required to put in place systems and processes to mitigate the risks that they have identified. Ofcom will hold companies to account both on the standard of their risk assessments and on the safety measures that they adopt and can take enforcement measures if either of these fall short of what is expected. The approach that we are taking means that children will be much less likely to encounter harmful content in the first place and platforms will no longer, for example, be able to target harmful material at children through the use of algorithms, as the noble Baroness, Lady Kidron, mentioned.

The noble Baroness, Lady Benjamin, asked why the Government cannot in the meantime bring in Part 3 of the Digital Economy Act. The Government have taken the decision to deliver the objective of protecting children from online pornography through the online safety Bill, which we are confident will provide much greater protection to children than Part 3 of the Digital Economy Act would, as it also covers social media companies, where a considerable quantity of pornographic material is available to children at the moment. Commencing Part 3 of the Digital Economy Act as an interim measure would also not be as quick a solution as I think the noble Baroness imagines. The Government would have to designate a new regulator and that regulator would need to produce and consult on statutory guidance. The Government would then need to lay regulations before Parliament ahead of any new regime coming into force. That is why we are keen, as noble Lords today have said they are as well, to do this through the online safety Bill and to do it swiftly.

We expect companies in the scope of the duties of the online safety Bill to use age-assurance technologies to prevent children from accessing content that poses the highest risk of harm. Standards have an important role to play in delivering that and Ofcom will be able to include standards for age assurance as part of its regulatory codes. Companies will either need to follow the steps in the codes, including using these standards, or demonstrate that they achieved an equivalent outcome.

The noble Baroness, Lady Kidron, asked whether the Bill would make reference to the United Nations Convention on the Rights of the Child. I cannot pre-empt the Government’s response in full to the Joint Committee on which she served, but I note in the meantime that the Bill reflects three principles of the general comments: that the best interests of the child should be a primary consideration; children’s right to life, survival and development; and respect for the views of the child. Of course, on that and all the recommendations, the Government will respond in full to the Joint Committee, for whose work we are very grateful.

As the noble Lord, Lord Addington, says, regulation of this nature will require effective enforcement. We have confirmed our decision to appoint Ofcom as the regulator and our intention to give it a range of enforcement powers, which will include substantial fines and, in the most serious cases, blocking. There will also be a criminal offence for senior managers who fail to ensure that their company complies with Ofcom’s information requests, to push strong compliance in this area. Ofcom will also be required to set out in enforcement guidance how it will take into account any impact on children due to a company’s failure to fulfil its duty of care.

The Bill will apply to companies that host user-generated content or enable user-to-user interaction, as well as to search services. We have taken this approach to ensure that the Bill captures the services that pose the greatest risk of harm to users and where there is currently limited regulatory oversight, without placing disproportionate regulatory burdens elsewhere.

I know that the noble Baroness and the Joint Committee have recommended aligning the scope of these measures with that of the age-appropriate design code. We are grateful for their consideration of this important issue as well. It is vital that any approach is proportionate and remains workable for businesses and Ofcom to ensure that the framework is as effective as possible. We are carefully considering the Joint Committee’s recommendations and are committed to introducing the Bill as soon as possible in this parliamentary Session. In the meantime, we are working closely with Ofcom to ensure that the implementation of the framework is as swift as possible, following passage of the legislation.

I will say a bit more about the interim measures that we are taking, as noble Lords rightly asked about that. We have a comprehensive programme of work to protect children online until the legislation is in force. Ahead of the Bill, the video-sharing platform and video-on-demand regimes are already in force, with Ofcom as the regulator. They include requirements to protect children from harmful online content such as pornography. In addition, the Government have published an interim code of practice for providers to tackle online child sexual exploitation and abuse.

The noble Baroness, Lady Prashar, mentioned our work in asking the Law Commission to review existing legislation on abusive and harmful communications. The Law Commission has published its final report, putting forward recommendations for reform. These include a recommended new offence to tackle communications that encourage or incite self-harm. The Government are considering the recommendations and will set out our position in due course.

As the noble and right reverend Lord, Lord Harries of Pentregarth, said, every death is sad—many are tragic, but they are especially so when they involve a young person. The Government recognise the difficulties that some bereaved parents have experienced when accessing their loved ones’ data. Disclosure of data relating to a deceased person is not prevented by the UK’s data protection legislation. As the noble Lord, Lord Allan of Hallam, noted, some companies operate policies of non-disclosure to third parties, including parents, unless a user has taken active steps to nominate a person who may access his or her account after he or she dies or if there is a legal obligation to disclose the data.

We are discussing this issue with companies directly. Officials met Instagram on 22 December, for instance. We are also in discussion with the Information Commissioner’s Office about digital assets. It is important to recognise, as the Joint Committee did, that an automatic right of access is unlikely to be appropriate in every case. Some people might be concerned about the disclosure of private information or other digital assets to third parties after their death.

The Government are grateful to the Joint Committee for its recommendations in this area. While I cannot make any commitment or pre-empt the Government’s response in full, I am happy to say that we will continue to give careful consideration to this before we respond and outline our proposed next steps.

It is worth noting that coroners already have statutory powers to require evidence to be given or documents to be produced for the purpose of their inquests—this would include relevant digital data following the death of a child—with sanctions where such evidence is not given or documents produced. They are well aware of these powers.

The right reverend Prelate the Bishop of St Albans mentioned his Private Member’s Bill. As he knows, the Coroners and Justice Act 2009 is clear that it is beyond a coroner’s powers to determine why somebody died. Coroners’ investigations are about determining who died, how, when and where, but not why. However, he is right that more can be done to understand some of those circumstances. We recognise that quality information on the circumstances leading to self-harm and suicide can support better interventions to prevent them in the first place. The Department for Health and Social Care is considering including questions on gambling as part of the adult psychiatric morbidity survey this year to help establish the prevalence of suicidal tendencies linked to gambling and to improve its evidence base. As the right reverend Prelate knows, we are taking a close look at the Gambling Commission’s powers as part of our review of the Gambling Act.

The Government are deeply concerned about the impact of harmful content and activity online on children. We are committed to introducing legislation as soon as possible to ensure that platforms are held to account for this content so that future generations can have a healthy relationship with the internet. I look forward to debating that Bill when it comes to this House. In the meantime, I thank noble Lords for their contributions to today’s debate.

Sitting suspended.