[Peter Dowd in the Chair]
[Relevant Documents: Second Report of the Petitions Committee, Session 2021-22, Tackling Online Abuse, HC 766, and the Government response, HC 1224; e-petition 272087, Hold online trolls accountable for their online abuse via their IP address; e-petition 332315, Ban anonymous accounts on social media; e-petition 575833, Make verified ID a requirement for opening a social media account.]
I beg to move,
That this House has considered online harms.
It is a great pleasure to see you in the Chair, Mr Dowd. This is the first time I have had the opportunity to serve in Westminster Hall under your chairmanship—[Interruption.] In a debate about technology, this was always going to happen. It is great to see the Minister, my hon. Friend the Member for Folkestone and Hythe (Damian Collins), in his place. He is enormously respected by Members on both sides of the House. He came to this role with more knowledge of his subject than probably any other Minister in the history of Ministers, so he brings a great deal to it.
This is an important and timely debate, given that the Online Safety Bill is returning to the Commons next week. Obviously, a great deal of the debate will be in the House of Lords, so I thought that it was important to have more discussion in the House of Commons. The Online Safety Bill is a landmark and internationally leading Bill. As a number of people, including my right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright), can attest, it has been a long time in gestation—five years, including two consultations, a Green Paper, a White Paper, a draft Bill, prelegislative scrutiny, 11 sessions of the Joint Committee on the Draft Online Safety Bill chaired by my hon. Friend the Member for Folkestone and Hythe, and nine days at Committee stage in the Commons. It is complex legislation, but that is because the subject that it addresses is complex.
Some want the Bill to go further, and I have no doubt that on Report and in the Lords there will be many attempts to do that. Others think it already goes too far. The most important message about the Bill is that we need to get on with it.
Technology is a big part of children’s lives—actually, it is a big part of all our lives. The vast majority of it is good. It provides new ways of keeping in touch, and ways of enhancing education for children with special educational needs. Think of all the rows in the car that have been done away with by the sat-nav—at least those rows. My personal favourite is the thing on my phone that says, “The rain will stop in 18 minutes,” so I know when to get my sandwich. Technology changes the way we live our lives. Think about our working lives in this place. Thanks to Tony Blair and new Labour, the pager got all MPs on message and disciplined, and now WhatsApp is having exactly the opposite effect.
In particular, in the Bill and this discussion we are concerned about social media. Again, most of what social media has given us is good, but it has also carried with it much harm. I say “carried with it” because much of that harm has not been created by social media, but has been distributed, facilitated and magnified by it. In the last couple of weeks, we have been reminded of the terrible tragedy of Molly Russell, thanks to the tireless campaigning and immense fortitude of her father, Ian, and her family. The coroner concluded that social media companies and the content pushed to Molly through algorithmic recommendations contributed to her death
“in more than a minimal way”.
My right hon. Friend is making an excellent speech, and I entirely agree that the Bill needs to come forward now. The algorithm is the key part to anything that goes on, in terms of dealing with online problems. The biggest problem I have found is trying to get transparency around the algorithm. Does he agree that the Bill should concentrate on exposing the algorithms, even if they are commercially sensitive, and allowing Ofcom to pull on those algorithms so that we do not get into the horrible situation that he has described?
I absolutely agree about the centrality of the algorithms and about understanding how they work. We may come on to that later in the debate. That brings me on to my next point. Of course, we should not think of Molly’s tragedy as a single event; there have been other tragedies. There is also a long tail of harm done to young people through an increased prevalence of self-harm, eating disorders, and the contribution to general mental ill-health. All of that has a cost to the individual and to society: a literal cost in cash, as well as a terrible social cost.
Importantly, this is not only about children. The ages of 18 to 21 can be a vulnerable time for some of the issues I have just mentioned. Of course, with domestic abuse, antisemitism, racist abuse and so on, most of that is perpetrated by—and inflicted on—people well above the age of majority. I found that the importance and breadth of this subject were reflected in my Outlook inbox over the past few days. Whenever a Member’s name is on the Order Paper for a Westminster Hall debate, they get all sorts of briefings from various third parties, but today’s has broken all records. I have heard from everybody, from Lego to the Countryside Alliance.
On that subject, I thank some of the brilliant organisations that work so hard in this area, such as 5Rights, the Children’s Charities Coalition, the National Society for the Prevention of Cruelty to Children, of course, the Carnegie Trust, the City of London Corporation, UK Finance, the Samaritans, Kick It Out, and more.
I should also note the three e-petitions linked to this subject, reflecting the public’s engagement: the e-petition to ban anonymous accounts on social media, which has almost 17,000 signatories; the petition to hold online trolls accountable, with more than 130,000 signatories; and the e-petition for verified ID to be required to open a social media account, with almost 700,000 signatories.
Such is the interest in this subject and the Online Safety Bill, which is about to come back to the Commons, that someone could be forgiven for thinking that it is about to solve all of our problems, but I am afraid that it will not. It is a framework that will evolve, and this will not be the last time that we have to legislate on the subject. Indeed, many of the things that must be done probably cannot be legislated for anyway. Additionally, technology evolves. A decade ago, legislators were not talking about the effect of livestreaming on child abuse. We certainly were not talking about the use of emojis in racist abuse. Today, we are just getting to grips with what the metaverse will be and what it implies. Who knows, in five or 10 years’ time, what the equivalent subjects will be?
From my most recent ministerial role as Minister of State for Security, there are three areas covered in the Online Safety Bill that I will mention to stress the importance of pressing on with it and getting it passed into law. The first is child abuse, which I have just mentioned. Of course, some child abuse is perpetrated on the internet, but it is more about distribution. Every time that an abusive image of a child is forwarded, that victim is re-victimised. It also creates the demand for further primary abuse. I commend the agencies, the National Crime Agency and CEOP—Child Exploitation and Online Protection Command—and the brilliant organisations, some of which I have mentioned, that work in this area, including the international framework around NCMEC, the National Centre for Missing and Exploited Children, in the United States.
However, I am afraid that it is a growth area. That is why we must move quickly. The National Crime Agency estimates that between 550,000 and 850,000 people pose, in varying degrees, a sexual risk to children. Shall I repeat those numbers? Just let them sink in. That is an enormous number of people. With the internet, the accessibility is much greater than ever before. The Internet Watch Foundation notes a growth in sexual abuse content available online, particularly in the category known as “self-generated” imagery.
The second area is fraud, which is now the No. 1 category of crime in this country by volume—and in many other countries. Almost all of it has an online aspect or is entirely online. I commend the Minister, and the various former Ministers in the Chamber, on their work in ensuring that fraud is properly addressed in the Bill. There have been three moves forward in that area, and my hon. Friends the Members for Hexham (Guy Opperman) and for Barrow and Furness (Simon Fell) may speak a bit more about that later. We need to ensure that fraud is in scope, that it becomes a priority offence and, crucially, that advertising for fraud is added to the offences covered.
I hope that, over time, the Government can continue to look at how to sharpen our focus in this area and, in particular, how to line up everybody’s incentives. Right now, the banks have a great incentive to stop fraud because they are liable for the losses. Anybody who has tried to make an online payment recently will know what that does. When people have a direct financial incentive—a cost borne when this crime is perpetrated—they will go to extraordinary lengths to try to stop it happening. If we could get that same focus on those who accept the content or advertisements that turn out to be fraudulent, imagine what we could do—my hon. Friend may be about to tell us.
I commend my right hon. Friend for the work that he has done. He knows, because we spoke about this when we both were Ministers, that the key implementation once this Bill is law will be fraudulent advertising. I speak as a former Pensions Minister, and every single day up and down this country our pensioners are defrauded of at least £1 million, if not £2 million or £3 million. It is important that there are targeted penalties against online companies, notably Google, but also that there are police forces to take cases forward. The City of London Police is very good, but its resources are slim at present. Does he agree that those things need to be addressed as the Bill goes forward?
I agree. Some of those matters should be addressed in the Bill and some outside it, but my hon. Friend, whom I commend for all his work, particularly on pensions fraud and investment fraud, is absolutely right that as the balance in the types of crimes has shifted, the ways we resource ourselves and tool up to deal with them has to reflect that.
Could you give me an indication, Mr Dowd, of how many Members are speaking in this debate after me?
I shall accelerate in that case. The third area I want to mention, from my previous role as Security Minister, is disinformation. I welcome what is called the bridge that has been built between the Online Safety Bill and the National Security Bill to deal specifically with state-sponsored disinformation, which has become a tool of war. That probably does not surprise anybody, but I am afraid that, for states with a hostile intention, it can become, and is, a tool in peacetime. Quite often, it is not necessarily even about spreading untruths—believe it or not—but just about trying to wind people up and make them dislike one another more in an election, referendum or whatever it may be. This is important work.
Health disinformation, which we were exercised about during the coronavirus pandemic, is slated to be on the list of so-called legal but harmful harms, so the Bill would also deal with that. That brings me to my central point about the hardest part of this Bill: the so-called legal but harmful harms. I suggest that we actually call them “harmful but legal”, because that better captures their essence, as our constituents would understand it. It is a natural reaction when hearing about the Online Safety Bill, which will deal with stuff that is legal, to say, “Well, why is there a proposed law going through the British Parliament that tries to deal with things that are, and will stay, legal? We have laws to give extra protection to children, but adults should be able to make their own choices. If you start to interfere with that, you risk fundamental liberties, including freedom of speech.” I agree with that natural reaction, but I suggest that we have to consider a couple of additional factors.
First, there is no hard line between adults and children in this context. There is no 100%—or, frankly, even 50%—reliable way of telling who is using the internet and whether they are above or below the age of 18. I know that my hon. Friend the Member for Gosport (Dame Caroline Dinenage), among others, has been round the loop many times looking at age verification and so-called age assurance. It is very difficult. That is why a piece of Ofcom research that came out a couple of weeks ago found that 32%—a third—of eight to 17-year-old social media users appear to be over 18. Why is that? Because it is commonplace for someone to sign up to TikTok or Snapchat, which have a minimum age of 13, when they are 10. They must give an age of at least 13 to be let in. Let us say they give their age as 14; that means that when they really are 14, the platform thinks they are 18—and so it carries on, all the way through life.
The right hon. Member and many other Members present will know that leading suicide prevention charities, including Samaritans and the Mental Health Foundation, are calling on the Government to ensure that the Online Safety Bill protects people of all ages from all extremely dangerous suicide and self-harm content. The right hon. Member makes very good points about age and on the legal but harmful issue. I hope very much that the Government will look at this again to protect more people from that dangerous content.
I thank the hon. Lady; I think her point stands on its own.
The second additional factor I want to put forward, which may sound odd, is that in this context there is not a hard line between what is legal and what is not. I mentioned emoji abuse. I am not a lawyer, still less a drafter of parliamentary legislation—there are those here who are—but I suggest it will be very hard to legislate for what constitutes emoji abuse in racism. Take something such as extremism. Extremist material has always been available; it is just that it used to be available on photocopied or carbon-copied sheets of paper. It was probably necessary to go to some draughty hall somewhere or some backstreet bookshop in a remote part of London to access it, and very few people did. The difference now is that the same material is available to everyone if they go looking for it; sometimes it might come to them even if they do not go looking for it. I think the context here is different.
This debate—not the debate we are having today, but the broader debate—is sometimes conducted in terms that are just too theoretical. People sometimes have the impression that we will allow these companies to arbitrarily take down stuff that is legal but that they just do not like—stuff that does not fit with their view of the world or their politics. On the contrary, the way the Bill has been drafted means that it will require consistency of approach and protect free speech.
I am close to the end of my speech, but let us pause for a moment to consider the sorts of things we are talking about. My right hon. Friend the Member for Mid Bedfordshire (Ms Dorries) made a written ministerial statement setting out an indicative list of the priority harms for adults. They are: abuse and harassment—not mere disagreement, but abuse and harassment; the circulation of real or manufactured intimate images without the subject’s consent; material that promotes self-harm; material that promotes eating disorders; legal suicide content; and harmful health content that is demonstrably false, such as urging people to drink bleach to cure cancer.
I suggest that when people talk about free speech, they do not usually mean those kinds of things; they normally mean expressing a view or being robust in argument. We have the most oppositional, confrontational parliamentary democracy in the world, and we are proud of our ability to do better, to make better law and hold people to account through that process, but that is not the same thing as we are talking about here. Moreover, there is a misconception that the Bill would ban those things; in fact, the Bill states only that a service must have a policy about how it deals with them. A helpful Government amendment makes it clear that that policy could be, “Well, we’re not dealing with it at all. We are allowing content on these things.”
There are also empowerment tools—my hon. Friend the Member for Stroud (Siobhan Baillie) may say more about that later in relation to anonymity—but we want users to be in control. If there is this contractual relationship, where it is clearly set out what is allowed in this space and someone signs up to it, I suggest that enhances their freedoms as well as their rights.
I recognise that there are concerns, and it is right to consider them. It may be that the Bill can be tightened to reassure everybody, while keeping these important protections. That might be around the non-priority areas, which perhaps people consider to be too broad. There might also be value in putting the list of priority harms in the Bill, so that people are not concerned that this could balloon.
As I said at the start, the Minister, my hon. Friend the Member for Folkestone and Hythe, knows more about this than probably any other living human being. He literally works tirelessly on it and is immensely motivated, for all the right reasons. I have probably not said anything in the past 10 minutes that he did not already know. I know it is a difficult job to square the circle and consider these tensions.
My main message to the Minister and the Government is, with all the work that he and others have done, please let us get on with it. Let us get the Bill into law as soon as possible. If changes need to be made to reassure people, then absolutely let us make them, but most of all, let us keep up the momentum.
It is a real pleasure to serve under your chairship, Mr Dowd. I congratulate the right hon. Member for East Hampshire (Damian Hinds) on securing this important debate. Many people will be watching who have taken a keen interest in the Online Safety Bill, which is an important piece of legislation, and the opportunities it offers to protect people from harmful, dangerous online content. I also welcome the Minister to his place. I am sure he will listen carefully to all the contributions.
My interest in the Bill is constituency based. I was approached by the family of a young man from my constituency called Joe Nihill, a popular former Army cadet who sadly took his own life at the age of 23 after accessing harmful online content related to suicide. Joe’s mother Catherine and sister-in-law Melanie have run an inspiring campaign, working with the Samaritans to ensure that the law is changed. In the note Joe left before he sadly took his life, he referred to such online content. One of his parting wishes was that what happened to him would not happen to others.
I want to ensure that the Minister and the Government take full opportunity of the Bill, so let me talk briefly about two amendments that might strengthen it. We want to protect people of all ages, and ensure that smaller online platforms as well as the larger ones are covered. Two related amendments have been tabled: amendment 159 by the hon. Member for Aberdeen North (Kirsty Blackman) and proposed new clause 16 by the right hon. Member for Haltemprice and Howden (Mr Davis). I know that they are backed by the Samaritans and the inspiring campaigners from my constituency.
Amendment 159, relating to protecting people of all ages, addresses the point that clearly harmful suicide and self-harm content can be accessed by over-18s, and vulnerable people are not limited to those under 18 years of age. Joe Nihill was 23 when he sadly took his own life after accessing such content.
As the Bill is currently drafted, its impact assessment states that, of the platforms in scope,
“less than 0.001% are estimated to meet the Category 1 and 2A thresholds”.
It is estimated that only 20 platforms will be required to fulfil category 1 obligations. If the Bill is enacted in its current form, unamended, then smaller platforms, where some of the most harmful suicide and self-harm content can be found, will not even need to consider the risk that any harmful but legal content on their site poses to adult users. Amendment 159 presents a real opportunity for the Government to close a loophole and further improve the legislation to ensure that people of all ages are protected.
The issue is all too relevant. Between 2011 and 2015, 151 people who died by suicide were known to have visited websites that encouraged suicide or shared information about methods of harm, and 82% of those people were over 25. The Government must retain regulation of harmful but legal content, but they should extend the coverage of the Bill to smaller platforms, where some of the most harmful suicide and self-harm content can be found. I urge the Government to carefully consider and adopt amendment 159.
Finally, on closing the related loopholes in the Bill, new clause 16, tabled by the right hon. Member for Haltemprice and Howden, would create a new communications offence of sending a message encouraging or assisting another person to self-harm. That offence is crucial to ensuring that the most harmful self-harm content is addressed on all platforms. As the Minister knows, Samaritans was pleased that the Government agreed in principle earlier this year to create a new offence of encouraging or assisting self-harm. That new offence needs to be created in time to be part of this legislation from the outset. We do not want to miss this opportunity. The Law Commission has made recommendations in this regard.
I urge the Government to make sure that the Bill takes all possible opportunities. I know that the Minister is working hard on that, as the right hon. Member for East Hampshire said. I plead with the Minister to accept amendment 159 and new clause 16, so that we do not miss the opportunity to ensure that people over 18 are protected by the legislation and that even the smaller platforms are covered.
The Bill, which will come back before the House next week, is a historic opportunity, and people across the country have taken a close interest in it. My two constituents, Catherine and Melanie, are very keen for the Government not to miss this opportunity. I know that the Minister takes it very seriously and I look forward to his response, which I hope will include reassuring words that the amendments on over-18s and smaller platforms will both be adopted when the Bill returns next week.
It is a pleasure to serve under your stewardship, Mr Dowd. I congratulate my right hon. Friend the Member for East Hampshire (Damian Hinds) on securing this vital and timely debate. Time is really of the essence if we are going to deliver the Online Safety Bill in this Session.
The scenario whereby the Bill falls is almost unthinkable. Thousands of man-hours have been put in by the team at the Department for Digital, Culture, Media and Sport, by the Home Office team, and by the Joint Committee on the Draft Online Safety Bill, which the Minister chaired so brilliantly. There have been successive ministerial refinements by quite a few of the people in the Chamber, and numerous parliamentary debates over many years. Most importantly, the stakes in human terms just could not be higher.
As my right hon. Friend said, that was painfully underlined recently during the inquest into Molly Russell’s death. Her story is well documented. It is stories like Molly’s that remind us how dangerous the online world can be. While it is magnificent and life-changing in so many ways, the dark corners of the internet remain a serious concern for children and scores of other vulnerable people.
Of course, the priorities of the Bill must be to protect children, to tackle serious harm, to root out illegal content and to ensure that online platforms are doing what they say they are doing in enforcing their own terms and conditions. Contrary to the lazy accusations, largely by those who have not taken the time to read this hefty piece of legislation, the Bill does not set out to restrict free speech, to protect the feelings of adult users or to somehow legislate for people’s right not to be offended.
Following on from other Members, I will talk about the legal but harmful issue. There is no easy way to define “legal but harmful”, because it is so opaque. Even the name is clunky and unappetising, as my right hon. Friend the Member for East Hampshire said. My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) sometimes uses the phrase “lawful but awful”, which often seems more appropriate, but it does not necessarily work from a legislative point of view.
If Molly Russell’s tragic case teaches us anything, it is that dreadful, harmful online content cannot be defined simply by what is strictly legal or illegal, because algorithms do not differentiate between harmless and harmful content. They see a pattern and they exploit it. They are, quite simply, echo chambers. They take our fears and our paranoia, and surround us with unhealthy voices that simply reinforce them, however dangerous or hateful they are. Fundamentally, they breadcrumb users into more content, slowly, piece by piece, cultivating an interest. They take us down a path we might not otherwise have followed—one that is seemingly harmless at the start, but that eventually is anything but.
We have a moral duty to keep children safe on online platforms, but we also have a moral duty to keep other users safe. People of all ages need to be protected from extremely harmful online content, particularly around suicide, self-harm and eating disorders, where the line between what is legal and what is illegal is so opaque. There is an inherent legal complexity in defining what legal but harmful really means.
It feels like this part of the Bill has become a lightning rod for those who think it will result in an overly censorious approach. That is an entirely misleading interpretation of what it seeks to achieve. Perversely, I feel that not putting protections in place would be more of a bar to freedom of speech, because at the moment users’ content can be taken down unpredictably, without any justification or redress. Others are afraid to speak up, fearing pile-on harassment and intimidation from anonymous accounts.
The fact is that this is a once-in-a-generation opportunity to make this legislation effective and meaningful.
It is a pleasure to serve under your chairmanship, Mr Dowd.
I thank the right hon. Member for East Hampshire (Damian Hinds) for securing this important debate on an issue with far-reaching impacts for the whole country. I welcome the Minister to his place. I look forward to the Online Safety Bill completing its passage in this Session. We have had to wait quite some time. Four years ago, the Government promised to tighten the law on online harms. The delay has, unfortunately, had devastating impacts for many people in this country: £3 billion has been lost to fraud, and 60,000 offences relating to online sexual abuse material and grooming have been committed.
As many Members have said, cyber-bullying has a disastrous effect on young people and, indeed, everyone across our communities. We know that big tech companies will not regulate themselves if it is not in their interest to do so. Sadly, it has taken a while for the necessary action to be taken. Instead, leading charities have been forced to support the many families who have been affected. I will focus on a key organisation in my constituency that does excellent work fighting for young disabled people and their rights.
Coventry Youth Activists is a wonderful organisation that has played a central role in campaigning for change in the way that disability hate is handled by social media platforms. CYA told me of a staggering 52% increase in online hate crime in 2021. However, its attempts to reach out and ask social media companies, specifically Facebook, to look into how hateful, ableist content is posted on their platforms, to review their algorithms and to respond effectively have not really been taken up—certainly not by Facebook.
We cannot continue this way. Many young people have suffered devastating impacts. There are tragic consequences to the bullying that many young people face online. One of my constituents told me that when he went to an online platform and asked to volunteer for a community organisation, someone said to him, “What is this giraffe thing? I hope he doesn’t procreate.” That had a significant impact on his mental health and ability to feel valued within the community. That is absolutely wrong. No one should have to experience such bullying.
As things stand, the online world is a space where bullies feel emboldened, because they know that there are zero consequences for their shameful actions. We cannot allow that to continue. Bullies need to know that when they post harmful, hateful things online, they will be dealt with effectively. I urge the Minister to meet with me and Coventry Youth Activists to discuss the important work they have been doing and to ensure that no young person is bullied online, particularly those with a disability. I want to see a world in which the virtual space is a safe space for everybody, whether or not they have a disability.
Lastly, I wish to mention the importance of eradicating misinformation and protecting young people. As the right hon. Member for East Hampshire said, misinformation is having a significant impact online and is making the online space more difficult for many people. I encourage the Minister to ensure that action is taken to make the digital space a safe space for young people.
It is a pleasure to serve under your chairmanship, Mr Dowd.
As many Members present know, I have been campaigning for verification options on social media to tackle anonymous abuse. I understand that new Prime Ministers and Secretaries of State like to put their own stamp on legislation, but I am appealing for no more delays in protecting children and adults from online abuse. In the time that I have been working on this issue I have had two children, dealt with a pandemic, a war and the death of a beloved monarch, and worked on thousands of cases for people in Stroud; if little old me can fit in that much, I know that the massive Government machine and fantastic civil servants want to get on with this legislation, and we can do it.
I do not mind re-fighting the case for tackling anonymous abuse, because I love working with the Clean up the Internet gang, and anonymity is a really important part of this issue. The ability to operate anonymous accounts is abused on a huge scale and is fuelling racist, antisemitic and sexist abuse, pictures of people’s genitalia being sent around, name calling, bullying, online fraud, misinformation, scams, and the evasion of the law. It is much scarier to receive such abuse when people do not know who is sending it. That is why we have to tackle these issues.
It is not rocket science to understand how the online disinhibition effect makes anonymous users feel less accountable and less responsible for their actions. Recent research by the charity Reset found that those in the much-fêted red wall seats see tackling abuse from anonymous accounts as a top priority for improving the experience of online life. The University of Sheffield and the children’s charity 5Rights, which has played such an important role, have found that the ability to create anonymous accounts is a risky design feature. I urge the Minister to look again at the work of the Antisemitism Policy Trust, which is a doughty champion on this issue. We know that our Jewish communities have suffered dreadfully, with increased abuse and threats in recent years. Issues surrounding the categorisation of platforms and risk factors are well known, but we need to use this opportunity to bring about change.
Our proposals would require social media platforms to give users the choice to verify their own accounts. They would make it very obvious if someone is verified and there would be the option to follow or be followed by only verified accounts. That would not stop the ability to be unverified. People could remain unverified, and that would assist whistleblowers, journalists and anyone in a marginalised group who wants to remain anonymous. In our plans, users could still be Princess or President So-and-so with a funny Twitter handle, but they would know that there is information behind the scenes.
Let me be clear: social media as it stands is damaging free speech. If someone is going to get a rape threat for saying what they think, they will not speak freely. We have to make these changes. The Minister is so brilliant in this policy area, and I urge him to make changes as soon as possible.
It is an honour to serve under your chairmanship, Mr Dowd. I thank my right hon. Friend the Member for East Hampshire (Damian Hinds) for securing this debate. It is a hackneyed phrase, but the Online Safety Bill is important and genuinely groundbreaking. There will always be a balance to strike between allowing free speech and stopping harms. I think we are on the right side of that balance, but we may need to come back to it later, because it is crucial.
I want to cover two topics in a short amount of time. The first is online harms through social media platforms, touching on the legal but harmful and small, high-harm platforms, and the second is fraud. Starting with fraud, I declare an interest, having spent a decade in that world before I came here.
I thank my right hon. Friend for clarifying that for me—although I would be better off now had I been on the other side of the fence.
Fraud is at epidemic levels. Which? research recently found that six in 10 people who have been victims of fraud suffered significant mental health harms as a result. I use this example repeatedly in this place. In my past life I met, through a safeguarding group, an old lady who accessed the world through her landline telephone. She was scammed out of £20,000 or so through that phone, and then disconnected from the rest of the world afterwards because she simply could not trust that phone when it rang anymore.
We live in an increasingly interconnected world where we are pushing our services online. As we do that, we cannot afford to disconnect people from the online world and take away from them the services we are opening up to them. That is why vital protections against fraud and fraudulent adverts on some of the larger social platforms and search engines are essential. I know it is out of the scope of this debate but, on the point made by my hon. Friend the Member for Hexham (Guy Opperman), that is also why it is crucial to fund the law enforcement agencies that go after the people responsible.
My right hon. Friend the Member for East Hampshire is right: banks have a financial motivation to act on fraud. They are losing money. They have the incentive. Where that motivation is not there, and where there is a disincentive for organisations to act, as is especially the case with internet advertising, we have to move forward with the legislation and remove those disincentives.
On harms, my right hon. Friend the Member for East Hampshire is right to mention the harmful but legal. We have to act on this stuff and we have to do it quickly. We cannot shy away from the problems that currently exist online. I serve on the Home Affairs Committee and we have seen and examined the online hate being directed at footballers; the platforms are not acting on it, despite it being pointed out to them.
When it comes to disinformation and small, high-harm platforms—
If it was not intimidating enough to be with the great and the good of the Online Safety Bill, trying to say everything I want in three minutes is even more of a challenge.
I will be brief. I came to this issue through body image; that is why I learned what I have on this subject. I simply ask for two things. In his speech, my right hon. Friend the Member for East Hampshire (Damian Hinds) said that this is about frameworks. I have two suggestions that I think would make a huge difference in respect of future-proofing the legislation and providing a framework. The first is to build on the fantastic work of my hon. Friend the Member for Stroud (Siobhan Baillie). We are talking about having authenticated anonymous and non-anonymous accounts. Giving the end user the choice of whether they want to go into the wild west is fundamental.
Now that, through the Content Authenticity Initiative —to which 800 companies around the world have signed up—the technology exists to have an open standard of transparency in respect of how images are taken, from the camera to how they are put in place, we have a framework that runs around the world that means people can make the same choice about images as about accounts. If we future-proof that in legislation, we simply allow the user to choose to switch on that tool and see images that we know are verified on an open source. It is not about someone making a decision; it is simply about understanding where the image comes from, how it got there, how it was made and who passed it on. That is an incredibly powerful and incredibly simple way to create a protective framework.
That leads me to my second, possibly more important, point, which was raised by my hon. Friend the Member for Gosport (Dame Caroline Dinenage). Algorithms are king when it comes to social media. Controlling them is very difficult, but someone should be responsible. In schools we have safeguarding leads for dealing with vulnerable people, and in councils we have financial named people, so why on earth do we not have a named person legally responsible for the algorithm in a company? We have it with GDPR. That would allow anyone in this debate, anyone in the police force, anyone in Ofcom or any member of the public to go to that person and say, “Why is your algorithm behaving in the way it is?” Every time I have tried to do that, I have been told that it is commercially sensitive and that there is a team somewhere else in the world that deals with it.
I know that Ofcom has the power to deal with this issue, but it is a one-off notice when it is requested. I simply think that stating that there is a named person legally responsible for the algorithm would change behaviours, because their head would be on the chopping block if they got it wrong. This is about responsibility. That is what the Bill provides, and that is why I am advocating for those two points.
I want to offer some help to my right hon. Friend the Member for East Hampshire (Damian Hinds) in looking further afield for evidence of this sort of thing working well. That evidence comes from the Council of Europe, which has been very active in this policy area for many years. It works with its 46 member states, the private sector, civil society and other actors to shape an internet based principally on human rights. It aims to ensure that the internet provides a safe and open environment where freedom of expression and freedom of assembly, diversity, culture, education and knowledge can all flourish.
The key pillar for the protection of human rights online is the European convention on human rights. The European Court of Human Rights, which rules on applications, has already delivered landmark judgments concerning the online environment—in particular, in connection with the right to freedom of expression, the right to access information and the right to privacy.
The Lanzarote convention, which we have already ratified, deals in particular with child abuse, which is of great concern to me. It deals with the fact that the form of online abuse keeps changing by involving children in the whole of the process. That is adjusted according to their age. Children and young people who exercise their right to freely express their views as part of this process must be protected from harm, including intimidation, reprisals, victimisation and violations of their right to privacy.
I urge my right hon. Friend and the Minister to look at what the Council of Europe has been doing. It is not part of the EU—they do not have to get tied up with all that—and it represents 46 countries. The issue has been looked at in great depth across wider Europe. They could learn a lot from that experience.
Another day, another Westminster Hall speech.
When I was the Pensions Minister I saw, sadly, hundreds of our constituents being defrauded of millions of pounds every single day by fake advertisers, primarily on Google, Instagram, Facebook and various other social media providers. The offences that have been added in clauses 34 to 36 of the Online Safety Bill are welcome, but I want an assurance from the Minister that there is provision against unregulated advertisers.
I give the example of Aviva, which gave evidence to the Work and Pensions Committee. It indicated that there were 55 separate fake Aviva sites advertising on Google for financial services. Constituents, particularly the elderly and the vulnerable, were being misled by those people and were signing away significant sums of money. I hope the provisions in clause 36 cover that situation, but I would be nervous that the Minister would rely on consumer protection in respect of the unfair trading regulations and the actions of the Competition and Markets Authority. I mean no disrespect, but those provisions are pretty ineffective and do not really address these points.
To deal with such issues, the answer is clearly to place a burden of proof on the recipient of the advert, making them vicariously liable for the content they have on their site. That would have the massive benefit, as identified by my right hon. Friend the Member for East Hampshire (Damian Hinds), of putting the burden on the site to justify the content on its site, and there should be consequential fines that are significant and repeated. It is very important that the Minister works with the new Home Office teams and that the police forces that are going to take these issues forward are beefed up considerably, because there simply is not currently enough resource to address these issues.
I thank organisations such as The Times—Matt Dathan has done good work on this issue. A lack of implementation will not be for a lack of money. We should bear in mind that Google, or Alphabet, made $14 billion profit last quarter. Its ability to regulate and follow through—to take the work that it is required to do by the Bill and to check advertisers and be responsible for the content, to put it bluntly—is very do-able, under all circumstances. I strongly urge the Minister to double-check that unregulated advertisers are covered in clause 36 and that there will be genuine fines and vicarious liability going forward.
It is a pleasure to speak in this debate, Mr Dowd. I follow a number of excellent speeches. The most excellent was from my right hon. Friend the Member for East Hampshire (Damian Hinds). He said many of the things that I would say, which is just as well given the time constraint I face.
Many people have said that the Bill will be back on Tuesday. I do not expect the Minister to confirm the business for next week, but if it does not come back next Tuesday, we will have a difficulty. The delay to the Bill must be either because people in the Government believe that it can be made perfect, or because they believe that it can be made less difficult. Neither of those two things is possible.
The Bill will always be imperfect. However hard many of us have worked to get it there, it will never be perfect, and it needs to be brought forward anyway. If people think the Bill’s fundamental choices will become easier by the passage of time, they are fundamentally mistaken. This will always be a difficult set of choices, but those choices need to be made. As my right hon. Friend the Member for East Hampshire said, when it comes to the most contentious part of the Bill—which is only about eight, nine or maybe 10 clauses of 190 or so—on what we shall now refer to as “harmful but legal” material, three things need to be understood by those who believe that that part of it is unacceptable.
First, as others have said, we should start with what the Bill actually says—always a good place to start. There is an important balancing duty on all platforms to protect freedom of speech, in addition to the duties they have to protect others from harm.
Secondly, as my right hon. Friend the Member for East Hampshire said, the platform is required to describe how it will handle harmful material; it is not required to remove that material automatically. That is not well understood. I would add that if the Government are to do any more work on the Bill, a definition of what is meant by harmful would be helpful and necessary.
We must understand that we regulate in other environments beyond the confines of the criminal law. The objective of this legislation has always been to create a more level playing field between the online world and every other world. We should remind ourselves that that is where the Bill starts and continues.
Thirdly, as my right hon. Friend also said, the status quo does not restrict the platforms from taking down whatever they like now. Anyone worried about freedom of speech should worry about the situation that we have today, not the situation that we will have under this legislation.
The fundamental point is that we have to get on with it. People have talked about the Bill being world leading, and it is, but we can only lead if we go first. Many others are also developing legislation. If we do not succeed in being world leading, we will miss an opportunity to set the standard in this legislation and regulation. Most importantly, we will let down our own citizens, who have a right to be kept safer online than they are.
I thank the right hon. Member for East Hampshire (Damian Hinds) for securing the debate. As he said, it is the right time to have this discussion, as one of the last opportunities to do so before the legislation leaves the House of Commons. He mentioned a number of organisations that have been in touch and have assisted with information. I do not think he mentioned—I apologise if he did—Refuge and Girlguiding, which both do excellent work and have provided an awful lot of useful information, particularly on how women and girls experience the online world. I accept that he could not possibly have covered every organisation in the time that he had to speak.
I apologise to hon. Members for the lack of Scottish National party colleagues here, which is not intentional: three others were supposed to attend, but for genuinely good reasons that I cannot pass on, they did not. I apologise for the fact that I am the only representative of the SNP—it was not intentional.
I want to pass on a comment from my hon. Friend the Member for Glasgow Central (Alison Thewliss), who highlighted to me what happened to St Albert’s Primary School at the beginning of this month or the tail end of last month. The First Minister of Scotland went to visit the school on 30 September to celebrate the work that it was doing on tackling climate change. As a result, the school was subject to horrific racist abuse. Thousands of racist messages were sent to St Albert’s Primary. I want to highlight that, because it is one of the reasons that we need this legislation. That abuse was aimed specifically at children and was genuinely horrific. I urge the Minister to look at that case so that he is aware.
The Bill has been needed for 30 years. It is not just something that we need now; we have needed it for a long time. I am very pleased that the Commons stages are nearly completed. Along with all other voices here, I urge the Government to please let the Bill come back to us so that we can finish our debate on it and it can complete its Commons stages. I feel as though I have spent quite a significant portion of my life dealing with the Bill, but I recognise that that is nothing compared with the hours that many hon. Members, organisations and staff have put in. It has been uppermost in my mind since the commencement of the Bill Committee earlier this year.
The internet is wonderful and brilliant. There are so many cool and exciting things to do on it. There are so many ways in which it makes our lives easier and enables people to communicate with each other. I can be down here and FaceTime my children, which would not have been possible had I been an MP 20 or 30 years ago. Those things are great. It is brilliant for children to be able to access the internet, to be able to access games and to be able to play. It is amazing that there is a new playground for people—one that we did not have 30 years ago—and these are really good things. We need to make sure that the legislation that comes in is permissive and allows those things to continue to happen, but in a way that is safe and that protects children.
Child sexual abuse has been mentioned. I do not want to go into it too much, but for me that is the key thing about the Bill. The Bill largely covers what I would hope it would cover in terms of child sexual abuse. I strenuously resist any suggestion that we need to have total end-to-end encryption that cannot be looked at even if there is suspicion of child sexual abuse, because it is paramount that we protect children and that we are able to catch the perpetrators sharing images.
We have talked about the metaverse and things in the future, but I am still concerned that some of the things that happen today are not adequately covered by the scope of the Bill. I appreciate what the hon. Member for Leeds East (Richard Burgon) said about amendment 159, which is incredibly important. It would allow Ofcom, which is the expert, to classify additional sites that are incredibly harmful as category 1. It would not be down to the Government to say, “We’re adding this one site.” It would be down to Ofcom, the expert, to make those decisions.
Social media is not just Facebook or Twitter. It is not just the way that older adults interact with each other on the internet. It is Fortnite, Discord, Twitch, Snapchat and Roblox. I do not know whether Members heard “File on 4” last night, but it was scathing in its criticism of Roblox and the number of horrific experiences that children are subjected to, on a platform that is supposed to be safe. It is promoted as a safe space for children, and it is absolutely not.
I am still massively concerned about clause 49, which talks about exempting
“one-to-one live aural communications”.
If one-to-one live aural communications are exempted, a one-to-one communication on Discord will be exempt from the legislation and will not count as user-generated content, even though it is user-generated content. I understand why the Government have put that in the Bill—it is about exempting telecoms, and I get that—but they have accidentally exempted a platform that groomers use in order to get children off Roblox, Fortnite or whatever they are playing and on to Discord, where they can have a conversation with those children. I am absolutely clear that clause 49 needs to be sorted so that the things the Government want to be exempted are still exempted, but the things that need to be in scope are in scope.
A point was made about the level of addiction, and the level of harm, that can be caused by algorithms. The idea of having a named person is very smart, and it is something that I would wholeheartedly support. It makes a huge amount of sense to include that in the Bill.
We have had an awful lot of chaos in the past wee while. Things have not looked as we expected them to look on any given day—things are changing in a matter of hours—but whatever chaos there is, the Government need to be clear that this issue is really important. It transcends party lines, arguments within the Conservative party and all of that. This is about protecting children and vulnerable people, and ensuring that we have protections in place. We need to make sure that legal but harmful is included in the Bill.
The hon. Member for Leeds East talked about ensuring that vulnerable adults are included in the Bill. We cannot just have provisions in place for children when we are aware that a huge number of adults are vulnerable for various reasons—whether that is because of mental health conditions, learning difficulties or age—and are potentially not protected if legal but harmful does not make it over the final hurdle. I urge the Minister to do that. The key thing is to please bring the Bill back so that we can get it into legislation.
It is always a pleasure to serve under your chairship, Mr Dowd. I am grateful to be here representing the Opposition in this important debate. This is the first time I have overwhelmingly agreed with every single excellent contribution in this Chamber. That goes to show that, as my friend the hon. Member for Aberdeen North (Kirsty Blackman) said, this does cross party lines and is not a political issue—at least, it should not be. There is huge cross-party consensus in this place, and the other place, about getting the Bill on the statute book and in action to protect everybody on the internet.
I pay particular tribute to the right hon. Member for East Hampshire (Damian Hinds) who, as a former Education Secretary, comes at this debate with a huge breadth of knowledge and experience. He is a former colleague of mine; we sat together on the Digital, Culture, Media and Sport Committee, where we scrutinised this legislation and these issues in depth. I know it is an issue he cares very deeply about. I echo his and other Members’ sentiments on the reappointment of the Minister, who comes at this with a breadth of experience and cares deeply. I am very pleased to see him in his post.
Regulation to tackle online abuse was first promised many years ago. In the initial White Paper, the Conservatives promised world-leading legislation. However, when the draft Online Safety Bill was published in May 2021, those proposals were totally watered down and incomplete. The Bill is no longer world leading. Since it was first announced that this Government intended to regulate the online space, seven jurisdictions have introduced online safety laws. Although those pieces of legislation are not perfect, they are in place. In that time, online crime has exploded, child sex abuse online has become rife and scams have continued to proliferate. The Minister knows that, and he may share my frustration and genuine concern at the cost that the delay is causing.
I recognise that we are living in turbulent political times, but when it comes to online harms, particularly in the context of children, we cannot afford to wait. Last week, the coroner’s report from the tragic death of Molly Russell brought into sharp relief the serious impact that harmful social media content is having on young people across the UK every day. Let me be clear; Molly Russell’s death is a horrific tragedy. I pay tribute to her father Ian and her family, who have, in the most harrowing of circumstances, managed to channel their energy into tireless campaigning that has quite rightly made us all sit up and listen.
Molly’s untimely death, to which, as the coroner announced last week, harmful social media content was a contributing factor, has stunned us all. It should force action from the Government. While I was pleased to note in the business statement last week that the Online Safety Bill will return to the House on Tuesday, I plead with the Minister to work with Labour, the SNP and all parties to get it through, with some important amendments. Without measures on legal but harmful content—or harmful but legal, as we are now referring to it—it is not likely that suicide and self-harm content such as that faced online by Molly or by Joe Nihill, the constituent of my hon. Friend the Member for Leeds East (Richard Burgon), will be dealt with.
Enough is enough. Children and adults—all of us—need to be kept safe online. Labour has long campaigned for stronger protections online for children and the public, to keep people safe, secure our democracy and ensure that everyone is treated with decency and respect. There is broad consensus that social media companies have failed to regulate themselves. That is why I urge the Minister to support our move to ensure that those at the top of multi-million-pound social media companies are held personally accountable for failures beyond those currently in the Bill relating to information notices.
The Online Safety Bill is our opportunity to do better. I am keen to understand why the Government have failed to introduce or support personal criminal liability measures for senior leaders who have fallen short on their statutory duty to protect us online. There are such measures in other areas, such as financial services. The same goes for the Government’s approach to the duties of care for adults under the Bill—what we call harmful but legal. The Minister knows that the Opposition has concerns over the direction of the Bill, as do other Members here today.
Freedom of speech is vital to our democracy, but it absolutely must not come at a harmful cost. The Bill Committee, which I was a member of, heard multiple examples of racist, antisemitic, extremist and other harmful publishers, from Holocaust deniers to white supremacists, which would stand to benefit from the recognised news publisher exemption as it currently stands, either overnight or by making minor administrative changes.
In Committee, in response to an amendment from my hon. Friend the Member for Batley and Spen (Kim Leadbeater), the Minister promised the concession that Russia Today would be excluded from the recognised news publisher exemption. I am pleased that the Government have indeed promised to exclude sanctioned news titles such as Russia Today through an amendment that they have said they will introduce at a later stage, but that does not go far enough. Disinformation outlets rarely have the profile of Russia Today. Often they operate more discreetly and are less likely to attract sanctions. For those reasons, the Government must go further. As a priority, we must ensure that the current exemption cannot be exploited by bad actors. The Government must not give a free pass to those propagating racist or misogynistic harm and abuse.
Aside from freedom of speech, Members have raised myriad harms that appear online, many of which we tried to tackle with amendments in Committee. A robust corporate and senior management liability scheme for routine failures was rejected. Basic duties that would have meant that social media companies had to publish their own risk assessments were rejected. Amendments to bring into scope small but high-harm platforms that we have heard about today were also rejected. The Government would not even support moves to name violence against women and girls as a harm in the Bill, despite the huge amount of evidence suggesting that women and people of colour are more at risk.
Recent research from the Centre for Countering Digital Hate has found that Instagram fails to act on nine out of 10 reports of misogyny over its direct messenger. One in 15 DMs sent to women by strangers was abusive or contained violent and sexual images. Of 330 examples reported on Twitter and Instagram, only nine accounts were removed. More than half of those that were reported continued to offend. The Government are letting down survivors and putting countless women and girls at risk of gendered harms, such as image-based sexual abuse—so-called revenge porn—rape threats, doxxing and tech abuse perpetrated by an abusive partner. What more will it take for meaningful change to be made?
I hope the Minister will address those specific omissions. Although I recognise that he was not in his role as the Bill progressed in Committee, he is in the unfortunate position of having to pick up the pieces. I hope he will today give us some reassurances, which I know many of us are seeking.
I must also raise with the Minister once again the issue of online discriminatory abuse, particularly in the context of sport. In oral questions I recently raised the very serious problem of rising discrimination faced not just by players but by their families, referees, coaches, pundits, fans and others. I know the hon. Member for Barrow and Furness (Simon Fell) tried to make this point in his contribution. Abuse and harm faced online are not virtual; they are real and have a lasting impact. Labour Members believe it is essential that tech firms are held to account when harmful abuse and criminal behaviour appear on, are amplified by and therefore flourish on their platforms.
There are genuine issues with the Government’s approach to the so-called legal but harmful provisions in the Bill that will, in essence, fail to capture some of the most harmful content out there. We have long called for a more systems-based approach to the Bill, and we need only to look at the research that we have had from Kick It Out to recognise the extent of the issue. Research from that organisation used artificial intelligence to identify violent abuse that falls below the current criminal thresholds outlined in the current draft of the Bill. There is no need for me to repeat the vile language in this place today. We have only to cast our minds back to 2020 and the Euros to recall the disgraceful abuse—and more—targeted at members of the England team to know the realities of the situation online. But it does not have to be this way.
Labour colleagues have repeatedly raised concerns that the current AI moderation practices utilised by the big social media giants are seemingly incapable of adapting to the rapid rate at which new internet-based languages, emojis and euphemisms develop. It is wrong of the Government to pursue an online harms agenda that is so clearly focused on content moderation, rather than considering the business models that underpin those harmful practices. Worse still, we now know that that approach often underpins a wide range of the harmful content that we see online.
The Times recently reported that TikTok users were able to easily evade safety filters to share suicide and self-harm posts by using slang terms and simple misspellings. Some of the content in question had been online for more than a year, despite including direct advice on how to self-harm. TikTok’s community guidelines forbid content that depicts or encourages suicide or self-harm, and yet such content still remains online for everyone to see.
We have concerns that the Government’s current approach will have little impact unless the big firms are held more accountable. What we really need is a consistent approach from the Government, and a commitment to tackling myriad online harms that is fit for the modern age and for emerging tech, too. There is a widespread political consensus on the importance of getting this right, and the Minister can be assured of success if only his Department is prepared to listen.
It is a pleasure to serve under your chairmanship, Mr Dowd. This is my first appearance as a Minister in Westminster Hall, and your first appearance in the Chair, so we are both making our debuts. I hope we have long and successful reigns in our respective roles.
It is a great pleasure to respond to the debate secured by my right hon. Friend the Member for East Hampshire (Damian Hinds) and to his excellent opening speech. He feels strongly about these issues—as he did both in Government and previously as a member of the Digital, Culture, Media and Sport Committee—and he has spoken up about them. I enjoyed working with him when he was a Minister at the Home Office and I chaired the prelegislative scrutiny Committee, which discussed many important features of the Online Safety Bill. One feature of the Bill, of course, is the inclusion of measures on fraud and scam advertising, which was a recommendation of the Joint Committee. It made my life easier that, by the time I became a Minister in the Department, the Government had already accepted that recommendation and introduced the exemption, and I will come on to talk about that in more detail.
My right hon. Friend, the hon. Member for Pontypridd (Alex Davies-Jones) and other Members raised the case of Molly Russell, and it is important to reflect on that case. I share the sentiments expressed about the tragedy of Molly’s death, its avoidable nature and the tireless work of the Russell family, and particularly her father, Ian Russell, whom I have met several times to discuss this. The Russell family pursued a very difficult and complicated case, which required a huge release of evidence from the social media companies, particularly Instagram and Pinterest, to demonstrate the sort of content to which Molly Russell was exposed.
One of the things Ian Russell talks about is the work done by the investigating officers in the coroner’s inquest. Tellingly, the inquest restricted the amount of time that people could be exposed to the content that Molly was exposed to, and ensured that police officers who were investigating were not doing so on their own. Yet that was content that a vulnerable teenage girl saw repeatedly, on her own, in isolation from those who could have helped her.
When online safety issues are raised with social media companies, they say things like, “We make this stuff very hard to find.” The lived experience of most teenagers is not searching for such material; it is such material being selected by the platforms and targeted at the user. When someone opens TikTok, their first exposure is not to content that they have searched for; it is to content recommended to them by TikTok, which data-profiles the user and chooses things that will engage them. Those engagement-based business models are at the heart of the way the Online Safety Bill works and has to work. If platforms choose to recommend content to users to increase their engagement with the platform, they make a business decision. They are selecting content that they think will make a user want to return more frequently and stay on the platform for longer. That is how free apps make money from advertising: by driving engagement.
It is a fair criticism that, at times, the platforms are not effective enough at recognising the kinds of engagement tools they are using, the content that is used to engage people and the harm that that can do. For a vulnerable person, the sad truth is that their vulnerability will probably be detected by the AI that drives the recommendation tools. That person is far more likely to be exposed to content that will make their vulnerabilities worse. That is how a vulnerable teenage girl can be held by the hand—by an app’s AI recommendation tools—and walked from depression to self-harm and worse. That is why regulating online safety is so important and why the protection of children is so fundamental to the Bill. As hon. Members have rightly said, we must also ensure that we protect adults from some of the illegal and harmful activity on the platforms and hold those platforms to account for the business model they have created.
I take exception to the suggestion from the hon. Member for Pontypridd that this is a content-moderation Bill. It is not; it is a systems Bill. The content that we use, and often refer to, is an exemplar of the problem; it is an exemplar of things going wrong. On all the different areas of harm that are listed in the Bill, particularly the priority legal offences in schedule 7, our challenge to the companies is: “You have to demonstrate to the regulator that you have appropriate systems in place to identify this content, to ensure that you are not amplifying or recommending it and to mitigate it.” Mitigation could be suppressing the content—not letting it be amplified by their tools—removing it altogether or taking action against the accounts that post it. It is the regulator’s job to work with the companies, assess the risk, create codes of practice and then hold the companies to account for how they work.
There is criminal liability for the companies if they refuse to co-operate with the regulator. If they refuse to share information or evidence asked for by the regulator, a named company director will be criminally liable. That was in the original Bill. The recommendation in the Joint Committee report was that that should be commenced within months of the Bill being live; originally it was going to be two years. That is in the Bill today, and it is important that it is there so that companies know they have to comply with requests.
The hon. Member for Pontypridd is right to say that the Bill is world-leading, in the sense that it goes further than other people’s Bills, but other Bills have been enacted elsewhere in the world. That is why it is important that we get on with this.
The Minister is right to say that we need to get on with this. I appreciate that he is not responsible for the business of this House, but his party and his Government are, so will he explain why the Bill has been pulled from the timetable next week, if it is such an important piece of legislation?
As the hon. Lady knows, I can speak to the Bill; I cannot speak to the business of the House—that is a matter for the business managers in the usual way. Department officials—some here and some back at the Department—have been working tirelessly on the Bill to ensure we can get it through in a timely fashion. I want to see it complete its Commons stages and go to the House of Lords as quickly as possible. Our target is to ensure that it receives safe passage in this Session of Parliament. Obviously, I cannot talk to the business of the House, which may alter as a consequence of the changes to Government.
On that point, will the Minister assure us that he will push for the Bill to come back? Will he make the case to the business managers that the Bill should come back as soon as possible, in order to fulfil his aim of having it pass in this Session of Parliament?
As the hon. Lady knows, I cannot speak to the business of the House. What I would say is that the Department has worked tirelessly to ensure the safe passage of the Bill. We want to see it on the Floor of the House as quickly as possible—our only objective is to ensure that that happens. I hope that the business managers will be able to confirm shortly when that will be. Obviously, the hon. Lady can raise the issue herself with the Leader of the House at the business statement tomorrow.
Could the Minister address the serious issue raised by my hon. Friend the Member for Hexham (Guy Opperman)? There can be no excuse for search engines to give a prominent place, or indeed any place, to fake Aviva sites—scamming sites—once those have been brought to their attention. Likewise, unacceptable scam ads for Aviva, Martin Lewis or whoever are completely avoidable if decent checks are in place. Will the Government address those issues in the Bill and in other ways?
I am grateful to my hon. Friend. The answer is yes, absolutely. It was always the case with the Bill that illegal content, including fraud, was in scope. The issue with the original draft Bill was that that scope did not include advertising. Advertising can be in the form of display advertising that can be seen on social media platforms; for search services, it can also be boosted search returns. Under the Bill, known frauds and scams that have been identified should not appear in advertising on regulated platforms. That change was recommended by the Joint Committee, and the Government accepted it. It is really important that that is the case, because the company should have a liability; we cannot work just on the basis that the offence has been committed by the person who has created the advert and who is running the scam. If an intermediary platform is profiting out of someone else’s illegal activity, that should not be allowed. It would be within Ofcom’s regulatory powers to identify whether that is happening and to see that platforms are taking action against it. If not, those companies will be failing in their safety duties, and they will be liable for very large fines that can be levied against them for breaching their obligations, as set out in the Online Safety Bill, which can be up to 10% of global revenues in any one year. That power will absolutely be there.
Some companies could choose to have systems in place to make it less likely that scam ads would appear on their platforms. Google has a policy under which it works with the Financial Conduct Authority and does not accept financial product advertising from organisations that are not FCA accredited. That has been quite an effective way to filter out a lot of potential scam ads before they appear. Whether companies have policies such as that, or other ways of doing these things, they will have to demonstrate to Ofcom that those are effective. [Interruption.] Does my hon. Friend the Member for Hexham (Guy Opperman) want to come in on that? I can see him poised to spring forward.
I would like to touch on some of the other issues that have been raised in the debate. The hon. Member for Leeds East (Richard Burgon) and others made the point about smaller, high-risk platforms. All platforms, regardless of size, have to meet the illegal priority harm standard. For the worst offences, they will already have to produce risk assessments and respond to Ofcom’s request for information. Given that, I would suspect that, if Ofcom had a suspicion that serious illegal activity, or other activity that was causing serious concern, was taking place on a smaller platform, it would have powers to investigate and would probably find that the platform was in breach of those responsibilities. It is not the case that if a company is not a category 1 company, it is not held to account under the illegal priority harms clauses of the Bill. Those clauses cover a wide range of offences, and it is important—this was an important amendment to the Bill recommended by the Joint Committee—that those offences were written into the Bill so that people can see what they are.
The hon. Member for Pontypridd raised the issue of violence against women and girls, but what I would say is that violence against everyone is included in the Bill. The offences of promoting or inciting violence, harassment, stalking and sending unsolicited sexual images are all included in the Bill. The way the schedule 7 offences work is that the schedule lists existing areas of law. Violence against women and girls is covered by lots of different laws; that is why there is not a single offence for it and why it is not listed. That does not mean that we do not take it seriously. As I said to the hon. Lady when we debated this issue on the first day of Report, we all understand that women and girls are far more likely to be victims of abuse online, and they are therefore the group that should benefit the most from the provisions in the Bill.
The hon. Member for Coventry North West (Taiwo Owatemi) spoke about cyber-bullying. Again, offences relating to harassment are included in the Bill. This is also an important area where the regulator’s job is to ensure that companies enforce their own terms of service. For example, TikTok, which is very popular with younger users, has in place quite strict policies on preventing bullying, abuse and intimidation on its services. But does it enforce them effectively? So far, we have largely relied on the platforms self-declaring whether that is the case; we have never had the ability to really know. Now Ofcom will have that power, and it will be able to challenge companies such as TikTok. I have also raised with TikTok my concern about the prevalence of blackout challenge content, which remains on that platform and which has led to people losing their lives. Could TikTok be more effective at removing more of that content? We will now have the regulatory power to investigate—to get behind the curtain and to see what is really going on.
I will just touch on a couple of other points that have been raised. My hon. Friend the Member for Barrow and Furness (Simon Fell) and other Members raised the point about the abuse of footballers. The abuse suffered by England footballers after the final of the European championship is a very good example. Some people have been charged and prosecuted for what they posted. It was a known-about risk; it was avoidable. The platforms should have done more to stop it. This Bill will make sure that they do.
That shows that we have many offences where there is already a legal threshold, and we want them to be included in the regulatory systems. For online safety standards, it is important that the minimum thresholds are based on our laws. In the debate on “legal but harmful”, one of the key points to consider, and one that many Members have brought up, is what we base the thresholds on. To base them on the many offences that we already have written into law is, I think, a good starting point. We understand what those thresholds are. We understand what illegal activity is. We say to the platforms, “Your safety standards must, at a minimum, be at that level.” Platforms do go further in their terms of service. Most terms of service, if properly enforced, would deal with most of the sorts of content that we have spoken about. That is why, if the platforms are to enforce their terms of service properly, the provisions on traceability and accountability are so important. I believe that that will capture the offences that we need.
My right hon. and learned Friend the Member for Kenilworth and Southam (Sir Jeremy Wright) rightly said—if I may paraphrase slightly—that we should not let the perfect be the enemy of the good. There will always be new things that we wish to add and new offences that we have not yet thought about that we need to include, and the structure of the Bill creates the framework for that. In the future, Parliament can create new offences that can form part of the schedule of priority illegal offences. On priority harms, I would say that that is the stuff that the platforms have to proactively look for. Anything illegal could be considered illegal online, and the regulators could take action against it.
Let me finish by thanking all the Members here, including my hon. Friend the Member for Gosport (Dame Caroline Dinenage), another former Minister. A very knowledgeable and distinguished group of Members have taken part in this debate. Finally, I thank the officials at the Department. Until someone is actually in the Department, they can never quite know what work is being done—that is the nature of Government—but I know how personally dedicated those officials are to the Bill. They have all gone the extra mile in the work they are doing for it. For their sakes and all of ours, I want to make sure that we pass it as soon as possible.
Question put and agreed to.
That this House has considered online harms.