Online Safety Bill

Volume 830: debated on Thursday 25 May 2023

Committee (9th Day) (Continued)

Schedule 7: Priority offences

Amendment 134

Moved by

134: Schedule 7, page 202, line 9, at end insert—

“Animal cruelty

A1 An offence under section 4 of the Animal Welfare Act 2006 (unnecessary suffering).

A2 An offence under section 19 of the Animal Health and Welfare (Scotland) Act 2006 (unnecessary suffering).

A3 An offence under section 1 of the Wild Mammals (Protection) Act 1996 (offences).”

Member’s explanatory statement

This amendment adds a number of animal welfare offences to the list of priority offences outlined in Schedule 7.

My Lords, this is an unusual group: it has just one amendment—Amendment 134 in the name of my noble friend Lord Stevenson. It has also been signed by the right reverend Prelate the Bishop of St Albans, whom I thank; I know that the right reverend Prelate is currently in a debate in Grand Committee.

This amendment seeks to add animal cruelty offences to the list of priority offences set out in Schedule 7, which would require platforms to proactively identify and remove content that depicts animal cruelty, including torture and death. This content is increasingly common, and it is shocking—films of cats being kicked about as footballs, dogs being set on fire and monkeys being ensnared into plastic bottles with dogs then being set upon them. All this is widely shared and viewed, and none of it is properly addressed by social media companies. These animal cruelty offences clearly meet the criteria of prevalence, risk of harm and severity of that harm, which have been set out and previously used by the Government to justify additions to the list.

I turn first to prevalence. The Social Media Animal Cruelty Coalition database comprises over 13,000 social media links showing animal abuse, collected over the past two years. Social media platforms often fail to remove animal cruelty films when they are reported, despite that being a clear contravention of their policies. In fact, less than 50% of links reported by the coalition since August 2021 have been removed, with predictions of a “rapid proliferation” of animal cruelty footage over the years ahead. This analysis is supported by the RSPCA, which received 756 reports of animal cruelty on social media in 2021, compared with 431 in 2020 and 157 in 2019.

The evidence of harm is also clear. Polling commissioned by the RSPCA five years ago found that nearly one in four of 10 to 18 year-olds had seen animal cruelty on social media sites—a proportion which is very likely to have increased subsequently, given the growth in the prevalence of animal abuse films in recent years. Viewing animal cruelty can cause psychological harm to children, with findings suggesting that,

“There is emerging evidence that childhood exposure to maltreatment of companion animals is associated with psychopathology in childhood and adulthood”.

Viewing animal abuse can also lead to imitative behaviour. Research suggests that children who witness animal cruelty are three to eight times more likely to abuse animals themselves, while those engaging in animal cruelty at a young age are more likely to exhibit abusive and violent behaviour towards people as they grow older.

Amendment 134 supports practical consideration of the effect of policy upon the welfare of animals as sentient beings to fulfil the requirements of the Animal Welfare (Sentience) Act 2022 and help the Government to meet their pledge to

“continue to raise the bar”

for animal welfare in the UK.

The adoption of the measures outlined in Amendment 134 would be a popular move. Thousands upon thousands of people have written in to make the Government aware that they want to see this modest addition, and they are supported by a wide range of animal welfare charities, including the RSPCA, the Born Free Foundation and the Humane Society International.

The Bill Minister in the other place stated that the addition of animal cruelty offences

“deserves further consideration as the Bill progresses through its parliamentary stages”.—[Official Report, Commons, 12/7/22; cols. 165-166.]

Earlier this year, as a comparator, the Government agreed to add offences under Section 24 of the Immigration Act, relating to illegal immigration, to the priority offences list. At the time, the Parliamentary Under-Secretary of State admitted that

“offences in Section 24 cannot be carried out online”,—[Official Report, Commons, 17/1/23; col. 314.]

but insisted that the inclusion of the offences was justified on the grounds of the damage that online encouragement of illegal immigration could cause to children. This is a helpful reference point: it suggests that offences under Section 4 of the Animal Welfare Act, on unnecessary suffering, which are directly commissioned for online content that is growing in prevalence and causing demonstrable harm to children, should be added to the priority list.

I hope the Minister is sympathetic to Amendment 134 so we can make the necessary progress. I beg to move.

My Lords, I rise to support Amendment 134, tabled by the noble Lord, Lord Stevenson, which was so ably introduced by the noble Baroness, Lady Merron. The Government accepted the Joint Committee’s recommendation that priority offences should be put in the Bill, and that is now contained in Schedules 5, 6 and 7. In particular, Schedule 7 sets out the priority offences. The noble Baroness, Lady Merron, has nailed it in setting out why these animal suffering-related offences fall within the Government’s criteria.

When the Government responded to the Joint Committee, they accepted our recommendation that we should put priority content in the Bill. As the noble Baroness, Lady Merron, said, the criteria are very clearly set out in paragraph 86 of their report:

“The prevalence of such content on regulated services … The risk of harm being caused to UK users by such content; and … The severity of that harm”.

The noble Baroness has absolutely set out how these offences fall within those criteria: the prevalence of these offences; the abuse that is present; the viewing by children and its impact on them; the impact on animal welfare, which would be positive if this content were treated as a priority offence; and the very strong public support.

Of course—the noble Baroness did not quite go here, but I will—there is a massive contrast with the inclusion of the encouragement of immigration offence in Schedule 7. These offences have far greater merit for inclusion in Schedule 7. I very much hope the Minister will accede to what I think is an extremely reasonable amendment.

I thank the noble Baroness for her amendment and the noble Lord, Lord Clement-Jones, for speaking so powerfully, as ever. I very much recognise the harms and horrors of cruelty to animals online or anywhere else. The UK has a proud history of championing and taking action on animal welfare, and the Government are committed to strengthening animal welfare standards and protections.

Our Action Plan for Animal Welfare demonstrates the Government’s commitment to a brighter future for animals both at home and abroad and provides a foundation for conversations on how we can continue to improve animal welfare and conservation in future. I can also reassure your Lordships that this Bill will tackle some of the worst online activities related to animal cruelty.

Amendment 134 seeks to add certain specified animal offences to the list of priority offences in Schedule 7. It is worth reminding ourselves that the Bill will already tackle some of the worst examples of animal cruelty online. This includes, for example, where the content amounts to an existing priority offence, such as extreme pornography, which platforms must prevent users encountering. Equally, where content could cause psychological harm to children, it must be tackled. Where the largest services prohibit types of animal abuse content in their terms of service, the Bill will require them to enforce those terms and remove such content. Improved user reporting and redress systems, as mandated by the Bill, will make it easier for users to report such content.

The Bill, however, is not designed to address every harm on the internet. For it to have an impact, it needs to be manageable for both Ofcom and the companies. For it to achieve the protections envisaged since the start of the Bill, it must focus on its mission of delivering protections for people. Schedule 7 has been designed to focus on the most serious and prevalent offences affecting humans in the UK, on which companies can take effective and meaningful action. The offences in this schedule are primarily focused on where the offences can be committed online—for example, threats to kill or the unlawful supply of drugs. The offences that the noble Baroness proposes cannot be committed online; while that would not stop them from being added for inchoate purposes, the Government do not believe that platforms would be able to take effective steps proactively to identify and tackle such offences online.

Crucially, the Government feel that adding too many offences to Schedule 7 that cannot be effectively tackled also risks spreading companies’ resources too thinly, particularly for smaller and micro-businesses, which would have to address these offences in their risk assessments. Expanding the list of offences in Schedule 7 to include the animal cruelty offences could dilute companies’ efforts to tackle other offences listed in the Bill which have long been the priority of this legislation.

Beyond the Bill, however, the Government are taking a very wide range of steps to tackle animal cruelty. Since publishing the Action Plan for Animal Welfare in 2021, the Government have brought in new laws to recognise animal sentience, introduced additional legislative measures to tackle illegal hare-coursing, and launched the animal health and welfare pathway as part of our agricultural transition plan. We will, of course, continue to discuss these important issues with colleagues at the Department for Environment, Food and Rural Affairs, who lead on our world-leading protections for animals, but, for the reasons I have set out, I am unable to accept this amendment. I therefore hope that the noble Baroness will withdraw it.

My Lords, I am grateful to the Minister for his considered reply, outlining the ways in which he believes the Bill supports where this amendment is going. I am also grateful to the noble Lord, Lord Clement-Jones, for his support. Indeed, it is my view that the criteria have been met for inclusion of these animal welfare offences in this list of priority offences. It is, of course, disappointing that the Minister does not share the view that we have expressed.

Perhaps I could pick up a point from the Minister’s response. It seems to me that something that is illegal offline should also be illegal online. If something is illegal under the various Acts referred to but there is user-to-user content of these animal cruelty films, for example, is the Minister saying that this will be covered by the Bill in its current form?

I note that the Minister has spoken of continuing discussions with Defra, which is very welcome. I am also requesting a meeting to pursue this. It is something on which we could make progress, and I hope that the Minister would be open to that. With that, I beg leave to withdraw the amendment.

Amendment 134 withdrawn.

Amendment 135 not moved.

Amendment 135A

Moved by

135A: Schedule 7, page 203, line 14, at end insert—

“10A An offence under section 76 of the Serious Crime Act 2015 (controlling or coercive behaviour in an intimate or family relationship).”

Member’s explanatory statement

This amendment adds the specified offence to Schedule 7, with the effect that content amounting to that offence counts as priority illegal content.

As noble Lords will recall from the earlier debate on this issue, His Majesty’s Government take tackling violence against women and girls extremely seriously. This is why we have ensured that the Bill provides vital protections for women and girls, so that they can express themselves freely online without fear of harassment or abuse.

As noble Lords know, the Bill places strong duties on providers regarding illegal content. The Bill takes an approach which protects all users, but the framework accounts for the fact that some offences can disproportionately affect certain people. To that end, we have already listed several priority offences in Schedule 7 that we know disproportionately affect women and girls. These include sexual exploitation, intimate image abuse— including so-called revenge pornography—and extreme pornography.

In addition, I want to be clear that the Bill will also cover content which intentionally encourages priority offences, an issue that was raised as a concern in our previous debate. Paragraph 33 of Schedule 7 has the effect that inchoate offences of encouraging or assisting a priority offence are themselves to be treated as priority offences under the Bill. As a result, for example, where there is content that intentionally or knowingly encourages harassment online, services will have proactive duties in relation to this content.

Furthermore, the Bill will soon—as I mentioned earlier—introduce new intimate image abuse offences to tackle behaviour, such as the sharing of deep-fake images. These new offences will be listed as priority offences, as is already the case for the current revenge pornography offence under Section 33 of the Criminal Justice and Courts Act 2015. These offences are a major milestone for protecting women and girls, and will be introduced to the Bill as soon as possible. They will sit alongside the Bill’s other criminal provisions, such as its offences on cyberflashing, false communications and threatening communications.

Although I appreciate the intention behind Amendments 269 and 270 and look forward to hearing the arguments made by the noble Lords who will speak to them, I remain concerned by the approach suggested to change Clause 167 to a consent-based model rather than the current intent-based approach. We are confident that the offence, as drafted, captures acts of cyberflashing, including when supposedly done “for a joke”—which, of course, it certainly is not. This is because the focus of the offence as drafted remains firmly on the perpetrator’s abhorrent behaviour and not on the actions of the victim, as would happen with a consent-based approach.

I am grateful to the noble Lords who attended the briefing we organised with Professor Penney Lewis of the Law Commission, who explained in detail why it has taken the approach it has in suggesting this drafting. I know that a number of noble Lords who attended found that it was useful and put their minds at rest. However, I look forward to listening to the thoughts of other noble Lords in this debate.

Amendment 271 would require instances of cyberflashing to be reported to the Crown Prosecution Service. The CPS is not the appropriate body to receive such reports, as its role is to prosecute criminal offences which the police have investigated. In addition, the Bill is primarily about introducing new duties on service providers. It requires them to take responsibility for the safety of their users and prevent illegal content and activity appearing on their services in the first place. I welcome the discussion we will have on these amendments and will listen closely to the points raised in them. However, I hope to be able to reassure noble Lords that these amendments are not needed.

Before we have that debate, I will address government Amendment 135A in my name. Although Schedule 7 already contains several offences which we know will go a long way to protect women and girls online, we want to go further to strengthen the protections in the Bill. That is why the Government have, following discussions, tabled an amendment to list controlling or coercive behaviour as a priority offence in Schedule 7. Doing so will require platforms proactively to identify and tackle content which amounts to this offence.

Measures that platforms can take to tackle this behaviour could include increased safety-by-design features aimed to prevent this type of content occurring in the first place, trusted flagger programmes, or options to enable users to report certain behaviour to platforms. As your Lordships will be aware, the offence of controlling or coercive behaviour, like some other priority offences currently listed in Schedule 7 such as harassment and stalking, includes a course of conduct element. I reassure noble Lords that platforms will be required to tackle this type of behaviour to comply with their safety duties in the Bill.

I hope that noble Lords will accept the government amendment. I look forward to the debate, and I beg to move.

My Lords, I will speak to Amendment 270 in the name of my noble friend Lady Featherstone, who I regret to tell the House is still indisposed, and to support Amendments 269 and 271 respectively in the names of the noble Baronesses, Lady Merron and Lady Berridge, which my noble friend also signed up to.

In the real world, if a man flashed his genitals at a woman or a girl in public, this would constitute a criminal offence punishable by up to two years in prison. Rightly so—she has no choice in the matter. He may do it to cause alarm, distress or humiliation, or to obtain sexual gratification. Apparently, he may hope that the girl or woman he flashes to will be overcome with desire for him, although you would be hard-pressed to find many cases of this pipe dream—pun intended—ever becoming a reality. More seriously, however, this behaviour can be a precursor to more serious offences, as happened with the murder of Sarah Everard.

In the online world, many things are done and said which would be totally unacceptable in real life. Therefore, while the motivation for physical flashing is usually to obtain sexual gratification or cause fear in the victim, in the permissive world of online other motivations are mooted—“for a laugh”, in the hope of reciprocal pictures being flashed back at the flasher, or even in the hope of initiating something physical—although in the online world, as well as in the real world, unsolicited images of male genitalia are rarely welcomed by women or girls.

Indeed, the vast majority of women who receive these unsolicited images are not laughing. Research by Professor Clare McGlynn KC at Durham University found that women report feeling violated, threatened, intimidated and harassed. They experience a loss of control, privacy and sexual autonomy. They feel personally targeted. Some women are bombarded with these images across social media and dating apps, so they are intimidated into changing their behaviour online as a result. This is in no one’s interests; it is something the Bill is intended to prevent. Research from Jessica Ringrose and colleagues at UCL found that 76% of teenage girls had been sent unwanted explicit images by their peers and by strangers. The women-orientated online dating platform Bumble produced a survey finding that 48% of millennial women had been cyberflashed in the last year alone.

The Law Commission recommended a new criminal offence of cyberflashing, and its recommendation has been incorporated in the Bill. The inclusion of cyberflashing as a criminal offence is to be warmly welcomed. However, the premise of the offence as it stands is motive based. This means that the prosecution must prove that the sender had the intention to cause distress, alarm or humiliation to the victim, or that sexual gratification was the motive and the sender was reckless as to any distress that might be caused. This means that many forms of cyberflashing, including when men are doing it to “have a laugh” or to show off to their friends, would not be covered, regardless of the harm caused to the victims. Currently, someone who sends a dick pic to a girl “for a laugh” is unlikely to be prosecuted if he thinks, “She should have found it funny too”. Really? If any noble Lord has evidence that most women enjoy unsolicited cyberflashing, let him bring it forward.

I ask the Minister the rhetorical question: how do you prove motivation? The Minister may have been told that many men send these images in the hope of getting nudes or other favours in return, and as a form of sexual gratification. Proving motive will be well-nigh impossible, and the sheer fact of having to prove it to get a conviction will put off police. They are hardly going to waste precious resources prying into senders’ backgrounds. How would you do that? Check out their porn habits, perhaps? This is the difficulty we have seen with the need to prove intent to cause distress in the distribution of intimate images offence. Under these strictures, very few prosecutions would result and girls and women would continue to suffer.

To my mind, and to the minds of all the women’s organisations that responded to the government consultations, it is all the wrong way around. Ultimately, the motivation of the perpetrator has no bearing on the outcome for the victim. They suffer regardless. If alarm, distress, humiliation, et cetera was caused, would the logical solution not be to take steps to prevent it in the first place? Would it not prevent a lot of suffering if the sender were to check that they were not going to cause those effects before sending the dick pics? It is not hard to do, but it may well cause someone who thinks it is a fun prank to send a picture of his willy to think again.

This amendment would send the clear message to men and boys that you have to have consent before you send your image. This would have an educational value, too. It is about learning how to respect women and girls and appreciate that they are different in their thinking from boys and men.

As it stands, sending unsolicited images could have the effect of bullying, intimidating or sexualising women and girls. This is not good enough, and we would be squandering the opportunity to protect women and girls. It is not difficult to ask whether it is okay before you send.

I do not understand the inconsistency of the Law Commission here. In its 2022 report on intimate image abuse, the Law Commission recommended an offence of taking and sharing intimate images without consent, regardless of motivation. It used the same arguments that Amendment 270 uses today, including difficulty in evidencing the intention of the perpetrator. In November 2022 the Government accepted the Law Commission’s recommendations. Surely the same principle applies to cyberflashing.

Amendment 270 has also addressed any circumstances in which use of legitimate images of genitals might be inadvertently caught in the net. These would, of course, not attract prosecution. I will not list them all here: they are in proposed new subsection (7) in the amendment. Young boys would be appropriately dealt with through the youth justice system: it is in no one’s interest to brand them with a criminal record, unless that would be in the public interest.

The argument boils down to priorities. The Bill as it stands prioritises boys’ and men’s rights to be “funny” over girls’ and women’s rights to live free from harassment and abuse. We must stop making women who receive such images responsible and hold those who commit gendered harms to account. I hope the Minister will agree.

My Lords, I support Amendment 135A in the name of the noble Lord, Lord Parkinson. I also support Amendment 269 in the name of the noble Baroness, Lady Merron, and Amendment 270 in the name of the noble Baroness, Lady Featherstone, and have added my name to both. I wish the noble Baroness, Lady Featherstone, good health and hope to see her back soon.

I welcome adding coercive control to Schedule 7 to ensure that content amounting to this offence counts as priority illegal content. Coercive control has a very damaging and long-term impact on mental health and, increasingly, abusers are maintaining their power and hold over victims through digital coercive control, which is like having invisible chains that you cannot break free from. This will send a clear message to tech companies that they must better understand and tackle online domestic abuse and will mean that perpetrators will be held accountable for their actions.

I also welcome the effort by the Government to criminalise cyberflashing. No one should be forced to see images of genitals. This is a growing form of sexual harassment of girls and women. Of course, I acknowledge that young boys and men can also be sent unwanted images. However, the majority of the cases involve images of male genitals being sent by men to women and girls. Very worryingly—as mentioned by the noble Baroness, Lady Burt—research by Professor Jessica Ringrose from 2020 found that 76% of girls aged 12 to 18 had been sent unsolicited nude images of boys or men.

While I am pleased that concerns raised by women and women’s groups have been heard by the Government, the wording in the Bill does not go far enough to protect women and girls from this type of sexual harassment. With the present wording, an offence is based on motive rather than consent, as mentioned by the noble Baroness, Lady Burt. I also thank Professor Clare McGlynn, Bumble and many others who have made a strong case for a consent-based cyberflashing offence rather than the motive-based approach proposed by the Government.

I put my name to these amendments because the offence in its current form will not be effective. It relies on the victims of cyberflashing, who will mostly be women and girls, to prove that the motive or intention in sending the image of genitals was deliberately to cause distress or for sexual gratification, so I ask: why should the onus be put on the victim to prove the sender’s intent when it comes to reporting cyberflashing? I would be grateful if the Minister could respond to this question.

How are women and girls going to prove the motive of the person sending the images, as already mentioned? I would also like the Minister to share his thoughts on that, as it will not be easy. The perpetrator could say “It was a joke” or “I did not do it for sexual gratification”. Then imagine them reporting it to the police, who research already shows do not take offline stalking and harassment seriously. In reality, many women and girls are likely to be turned away by the police because they will say it is hard to prove the motive. Once men realise that they can get away with sending unwanted sexual images, they may then deliberately gaslight women and girls and send further images, safe in the knowledge that they will not have the law used against them.

The current wording in the Bill, which bases the offence on motive and not consent, may well have been adopted because the Law Commission recommended it. However, that does not mean the commission got it right, despite its best intentions. What is even more confusing is that the current approach is out of step with other laws. For example, for intimate image abuse, offences are based on consent, which the Law Commission actually recommended. I am not sure why the Law Commission has not been consistent in its approach, as has been mentioned.

Perhaps it may not have wanted to criminalise young people for misjudged humour. However, I understand that there are CPS guidelines on having the option to be more lenient on younger people when they commit offences. Perhaps guidelines for this offence can be drawn up where they are given a first warning, which is logged by the police, and arrested only if they commit a second offence. Similar warnings exist for harassment, but that first warning must still be counted, even if the perpetrator moves on to a different victim.

I feel that the current approach in the Bill is sexist because it prioritises the entitlement of men to send images over a woman’s entitlement not to receive those images. The current approach also does not consider the impact on women and girls of receiving such unwanted images. It can cause significant emotional distress, with feelings of violation, it can be traumatic due to past trauma, and it can make them feel very unsafe. The current approach will also contribute to the further normalisation of sexual harassment, and inadvertently help to perpetuate a culture where men will feel entitled to invade the boundaries of women without any consequences.

Research by Bumble has found alarming statistics, including that: 35% of women have received an unsolicited nude image at work; 27% of women have received an unsolicited nude image when on public transport; one in five women have received unsolicited nude images when walking down the street; and almost half of 18 to 24 year-olds have received a sexual image that they did not consent to. Let us make the law clearer and stronger by basing the offence on consent. If anyone has not consented to receiving sexual images then it should be an offence. I therefore urge that Clause 167 is left out and replaced with the new clause as proposed by the noble Baroness, Lady Featherstone.

My Lords, I am grateful to noble Lords who have added their name to my Amendment 271, which arose out of concerns that there are now seemingly several offences that laudably aim to protect women but are not being enforced effectively. The most notable in this category is the low rate of rape cases that are prosecuted and lead to convictions. In theory, the amendment is not affected by the definition of cyberflashing, whether it takes the specific intent form recommended by the Law Commission or is instead based on consent. In practice, however, if the offence remains in that specific intent form, the victim will not be required to go to court, so the amendment would be more effective if the offence remained on that basis. Yet even if the victim does not need to go to court, someone who has been cyberflashed is, as other noble Lords have mentioned, unlikely to go to the police station to report what has happened.

This amendment is designed to put an obligation on the providers of technology to provide a reporting mechanism on phones and to collate that information before passing it to the prosecuting authorities. The Minister said that there are various issues with how the amendment is currently drafted, such as the reference to “the Crown Prosecution Service” rather than “the police”, and perhaps the definition of “providers of internet services”, as it may be a different part of the tech industry that is required to collate this information.

Drawing on our discussions on the previous group of amendments regarding the criminal law here, I hope that my noble friend can clarify the issues of intent, which is mens rea and different from motive in relation to this matter. The purpose of the amendment is to ensure that there will be resources and expertise from the technology sector to provide these reporting mechanisms for the offences. One can imagine how many people will report cyberflashing if they only have to click on an app, or if their phone is enabled to retain such an image, since some of them disappear after a short while. You should be able to sit on the bus and report it. The tech company would then store and collate that information, potentially in a manner that would make patterns clear. For instance, because this happens so much, as we have just heard, if six people on the 27 bus report multiple times a week that they have received the same image, that would prompt the police to get the CCTV from the bus company to identify who this individual is, if the tech company data did not provide that specificity. Or, is someone hanging out every Friday night at the A&E department and cyberflashing as they sit there? This is not part of the amendment, but such an app or mechanism could also include a reminder to change the security settings on your phone so that you cannot be AirDropped.

I hope that His Majesty’s Government will look at the purpose of this amendment. It is laudable that we are making cyberflashing an offence, but this amendment is about the enforcement of that offence and will support that. Only with such an easy mechanism to report it can what will be a crime be effectively policed.

My Lords, I, too, wish the noble Baroness, Lady Featherstone, a very speedy recovery. Her presence here today is missed, though the amendments were very ably moved by the noble Baroness, Lady Burt. Having worked in government with the noble Baroness, Lady Featherstone, I can imagine how frustrated she is at not being able to speak today on amendments bearing her name.

As my noble friend said, this follows our debate on the wider issues around violence against women and girls in the online world. I do not want to repeat anything that was said there, but I am grateful to him for the discussions that we have had since. I support the Government in their introduction of Amendment 135A and the addition of controlling or coercive behaviour to the priority offences list. I will also speak to the cyberflashing amendments and Amendment 271, introduced by my noble friend Lady Berridge.

I suspect that many of us speaking in this debate today have had briefings from the wonderful organisation Refuge, which has seen a growing number of cases of technology-facilitated domestic abuse in recent years. As a result of this, Refuge pioneered a specialist technology-facilitated domestic abuse team, which uses expertise to support survivors and to identify emerging trends of online domestic abuse.

I draw noble Lords’ attention to a publication released since we debated this last week: the National Police Chiefs’ Council’s violence against women and girls strategic threat risk assessment for 2023, in which a whole page is devoted to tech and online-enabled violence against women and girls. In its conclusions, it says that one of the key threats is tech-enabled VAWG. The fact that we are having to debate these specific offences, but also the whole issue of gendered abuse online, shows how huge an issue this is for women and girls.

I will start with Amendment 271. I entirely agree with my noble friend about the need for specific user reporting and making that as easy as possible. That would support the debate we had last week about the code of practice, which would generally require platforms and search engines to think from the start how they will enable those who have been abused to report that abuse as easily as possible, so that the online platforms and search engines can then gather that data to build up a picture and share it with the regulator and law enforcement as appropriate. So, while I suspect from what the Minister has said that he will not accept this amendment, the points that my noble friend made are absolutely necessary in this debate.

I move on to the cyberflashing amendment. It has been very ably covered already, so I do not want to say too much. It is clear that women and girls experience harms regardless of the motives of the perpetrator. I also point out that, as we have heard, motivations are very difficult to prove, meaning that prosecutions are often extremely unlikely.

I was very proud to introduce the amendments to what became the Domestic Abuse Act 2021. It was one of my first contributions in this House. I remember that, in the face of a lockdown, most of us were working virtually. But we agreed, and the Government introduced, amendments on intimate image abuse and revenge porn. Even as I proposed those amendments and they were accepted, it was clear that they were not quite right and did not go far enough. As we have heard, for the intimate image abuse proposals, the Law Commission is proposing a consent-based image abuse offence. Can my noble friend be even clearer—I am sorry that I was not able to attend the briefing—about the distinction between consent-based intimate image abuse offences and motive-based cyberflashing offences, and why the Government decided to make it?

I also gently point out to him that I know that this is complicated, but we are still waiting for drafting of the intimate image abuse offences. We are potentially running out of time. Perhaps we will see them at the next stage of the Bill—unless he reveals them like a rabbit out of a hat this afternoon, which I suspect is not the case. These are important offences and it will be important for us to see the detail so that we can scrutinise them properly.

Finally, in welcoming the Government’s amendment on coercive control, I say that it is generally poorly understood by technology companies. Overall, the use of the online world to perpetrate abuse on women and girls, particularly in the domestic abuse context, is certainly being understood more quickly, but we are all playing catch-up in how this happens while the perpetrators are running ahead of us. More can be done to recognise the ways that the online world can be used to abuse and intimidate victims, as the Government have recognised with this amendment and as the noble Baroness, Lady Gohir, said. It is very necessary in debating the Bill. I look forward to hearing the Minister’s remarks at the end of this debate.

My Lords, I am pleased to add my name to Amendment 271, to which the noble Baroness, Lady Berridge, has spoken so comprehensively. I too heard the criticisms made by the Minister, but they do not take away from the intention behind that amendment, which is really important. Like others, I hope to convey the well wishes from around the House to the noble Baroness, Lady Featherstone. I am grateful to the noble Baroness, Lady Burt, for introducing this whole section with a great deal of clarity.

I will not repeat what has been said about the traumas involved, because that has been covered. It seems that one of the real difficulties is how people in receipt of these ghastly images and experiences can report them and get something done.

A study from UCL and the University of Kent in 2021 found that 51% of people who had received unwanted sexual content without their consent did nothing about reporting it and one-third of people thought that reporting did not work anyway; there was no point, and it would add to their own trauma. The other difficulty with reporting, which I do not think has been touched on yet, is that some of the perpetrators of cyberflashing do so with false identities. They are nearly all men who take on false identities, so tracing them is particularly difficult.

In listening to the debate, I welcome the government amendment because, as others have said, coercive and controlling behaviour has been very poorly understood across the whole of society until very recently. It is to the credit of the noble Baroness, Lady Morgan of Cotes, that awareness has risen. People have seen what has been happening in front of their own eyes, but they never noticed it previously.

It seems that this difficulty, again, goes back to the debate we had previously on the need for some form of complaints system, whereby people can report easily and have confidence in that complaint being collated with other similar complaints, which would then allow action to be taken.

My Lords, I welcome government Amendment 135A and the inclusion in the Bill of the new offence of cyberflashing.

I understand why questions have been raised, and indeed arguments advanced, about the way in which this offence has been crafted and whether the onus should be on the perpetrator or the victim of such a crime. I tend to come down on the side of the Law Commission and what is in the Bill as it stands. I have thought about it, and I have listened carefully and read the various briefings. I have weighed it up and found it quite hard at times to make my mind up. On balance, I would stick with what is in the Bill.

The noble Baroness, Lady Burt, said something I am not sure is correct. She said that, in the way it is currently included in the Bill, it will be the responsibility of women and girls to show that they are harmed by this. My understanding is that the opposite of that is true; they just need to report it and the responsibility sits on the shoulders of the person distributing these images. I am sure my noble friend the Minister will be able to confirm that—or otherwise—when he comes to wind up.

The only other thing in that context which I will add—I think this has been touched on by others—is that it is important, in introducing this as a new offence, that we ensure that we educate young people away from what I have been told has now become quite a common practice as a way of expressing interest in one another. I do not think that, just because it is happening, we should tolerate it and say, “Okay, well that’s all right then”. I do not think that it is right, and we should be much clearer about advising and explaining to our young people why that is not the best way to express any kind of interest in anyone, whether they are of the opposite sex or of the same sex. I understand that this is a common practice among gay men as well. I just think that taking photographs of one’s genitals and distributing them to other people is not a good idea—that is my argument.

My noble friend Lady Berridge’s Amendment 271 is an interesting proposal. What I found compelling about it was her argument that we will introduce a new offence in the Bill, and, specifically in that context, she proposes a way to report receiving these pictures when people do not want to receive them, and to do so in a way that makes it easier for the police to see new trends and incidences emerging. It is then more likely that they would be able to pursue a perpetrator. However, although I hope my noble friend the Minister will consider this carefully, I do not know what the tech companies would argue about their position, having been given that responsibility. So I am interested in her proposal and think that it is worth proper consideration, but I say that without the benefit of an understanding of where the tech firms are on it. But, overall, I welcome what the Government propose and offer my support.

My Lords, alongside others, I very much welcome government Amendment 135A and how the Minister introduced it. But there is a big “but” as regards much of the rest of what he said. I very much welcome that this will be included as a priority offence, and I join other noble Lords in that—but there is still a view out there that women and girls are being short-changed by the Bill. The other day, we had a debate on the Violence Against Women and Girls Code of Practice, and the same feeling about the cyberflashing offence was very much there, which is why I strongly support Amendments 269 and 270, which would alter the nature of Clause 167.

The equivalence between online and offline was mentioned by my noble friend Lady Burt—I also regret that my noble friend Lady Featherstone has not been with us for some time—and she set out extremely clearly and well how this kind of cyberflashing offence leads to other and worse offences in both the offline and the online worlds, as we have seen.

Like others, I am indebted to Professor McGlynn for her analysis of the proposed offence. We had evidence from UCL and the Bumble survey, and there is of course also the YouGov survey that shows that nearly half of young women aged 18 to 24 have been sent an unwanted penis image—that is an extraordinary figure. So all of the evidence of this offence is there.

We have heard differing views on the offence—the noble Baronesses, Lady Berridge and Lady Stowell, are on the side of the status quo on the nature of the offence. The fact is that the Government’s proposal covers only some cases of cyberflashing: those motivated by a desire to cause “distress”, or done for “sexual gratification” with recklessness about causing distress.

I am not a criminal lawyer, but, in answer to the noble Baroness, Lady Stowell, you have to show intent beyond reasonable doubt—that is where the onus on the victim arises. There is a very high barrier in a criminal offence. My noble friend made that point clearly, and the analysis of the noble Baroness, Lady Gohir, was absolutely right that, of course, if you make it a criminal offence, where the issue is about consent rather than intent, you can always be more lenient when an offence does not seem so egregious, where there is clear misunderstanding or where there are other mitigating factors—that is what happens under the criminal law.

This is all about proving the motive—that is the real problem; it is technically called mens rea or the intent—so we need a clear message, as my noble friend said. I believe that we are squandering an opportunity here; it could be a real opportunity for the Government to send a much more powerful signal that the Bill is about protecting women and girls, despite the very welcome addition of abuse under Amendment 135A.

The noble Baroness, Lady Berridge, put her point extremely well. She made a very good case for another addition to the armoury of user-empowerment tools. Although I disagree with her about the ambit of the cyberflashing offence, she proposed something which would be extremely useful to add.

We ought to take heed of the noble Baroness, Lady Morgan, given her legal background. She referred to the Law Commission’s rather inconsistent approach. The very welcome proposal to extend the way that the revenge porn offences will apply seems to be extremely sensible. I am afraid that, in the battle of the professors, I prefer what Professor McGlynn is saying to what Professor Lewis is saying; that is the choice that I have made.

Following the way that the noble Baroness, Lady Gohir, talked about this issue, we need to call men to account. That is something that the Government need to pay heed to.

That is all I want to say on this subject. This is not just a technical aspect—it is not just a question of whether or not we accept the Law Commission’s advice in this particular case—it is about the difficulty that young women, in particular, will find in enforcing this offence, and we need to be very mindful of that.

My Lords, this has been a very good and well-focused debate. We have concentrated on the particular issue that I will address in my speech: why there is a difference in the Government’s approach to what seem, on the surface, to be very similar instances. The evidence we have all been looking at, and are convinced by, shows that, in some way, the internet as currently constructed is gendered against women. Something must be done about that.

I am grateful to the Minister for introducing this group. Amendment 135A is very well-drafted and appropriate for what we are doing. I have very little to comment on it. It is a difficult area, but I am glad that the Government are putting their money where their mouth was on this issue and that we are seeing some action coming forward.

My noble friend Lady Merron would have been speaking to our amendments in this group, but, unfortunately, she has been taken off for some treatment to her leg, which seems to have been slightly twisted. They follow on from the meeting that the Minister mentioned with Professor Lewis from the Law Commission, when she very expertly introduced this whole topic, explaining carefully, and with great concern, the reasons why the Law Commission has proposed, and the Government have accepted, that the new law to be brought in on cyberflashing needs to be different. My problem with that was that it seemed, by the end of it, that the rationale for doing it differently from other offences of a similar nature and type was more to do with the fact that there were good reasons for having the ability to send dick pics—let us call them that, even though it is a horrible term to use.

It is sometimes necessary for pictures to be sent around, and examples of that were given. For example, in a medical situation in which a doctor, perhaps during the Covid epidemic, wanted to know about a patient’s particular problem in the genital area, a picture would be helpful in diagnosis. Sending that should not be made illegal. Other reasons were given. The argument was good, but not sufficient to trump the need to have in place a set of laws relating to the way in which the internet treats women and girls in this dimension that does not come from different directions and is not confusing but complementary.

That is why we have tabled Amendment 269, to probe a little further why the Government have opted for this approach, and the argument has been well made. It has been further picked up in the amendment from the noble Baroness, Lady Featherstone, which we also signed—and I send my best wishes to her for a speedy recovery. The way in which the noble Baroness, Lady Burt, introduced it made it very clear that the direction that she was coming from was that this was not a good thing, and I want to cover some of the ground on that.

We do not need to spend too much time on this, as the issue is very narrow; it is a question of choosing between two options for how we bring this forward—whether the offence is consent-based, or whether it requires specific evidence of intent from the individual who sent the messages as to why they did it. For all the reasons given, it seems to me that a consent-based offence is a much more comprehensive way in which to approach the issue. It covers all forms—you do not have to distinguish between them. It is more straightforward and much easier to understand, and that is a really important point. If, in passing the law, we are simply trying to attack bad behaviour, we are not going far enough. We want the laws to be illustrative and for people to learn from them. A simple law, with the simple concept that consent is required and that without it the act is illegal, opens up the opportunity to make much more of this than we previously have.

The evidence is there that this is a widespread and unpleasant happening. AirDropping seems to be one of the causes of it, and it is something that we do not seem to address. We are attacking the end result, which is the distress and unpleasantness caused, but have the Government thought about whether AirDropping could be looked at in more detail, so that the process itself was more of an offence, as it is one that seems to cause harm? If you are in an enclosed space, such as on a bus or another form of public transport, and somebody sends you an unsolicited image because you happen to have your phone on and have not barred AirDropping, the distress is compounded by the fact that somewhere close to you is somebody sending you images that you do not want to receive, in a way that is meant to threaten or, as we have heard, might be intended to attract your attention in a positive way but is, in any event, unwanted. If we are not attacking that, why not? That seems an open goal, in some senses.

For all the reasons that we have this coming forward, we have arrived at a situation in which a large number of people who are affected by this and want to see the law change do not like the solution. It is inconsistent. The Government do not have to accept Law Commission recommendations. In fact, I have spent quite a lot of my time when not involved in this Bill trying to redress a decision taken by the Government not to implement a Law Commission proposal for a law to replace a particularly egregious Victorian piece of legislation that is causing a lot of distress among people who take out high-cost loans, but we have not been able to make progress. The Government do not want to move. The Law Commission has developed a Bill, and I have put it in as a Private Member’s Bill but it has still not got through. I will have one more go before I give up. But we do not have to follow the Law Commission; we still have choices to make in relation to this. It could be a consent-based offence. For all the reasons that others have given—and I support them—it is a pity that that is not the case.

I signed up to the amendment in the name of the noble Baroness, Lady Berridge, because it is a very helpful one that we should think carefully about. It seems to address the question of how to ensure that the process of reporting, and the subsequent carrying forward of the cases that are made, is picked up by those who would perhaps need a bit of support in that area. That is something we should support as well. I look forward to the Minister’s response.

I join all those who have sent our best wishes to the noble Baroness, Lady Featherstone, for a speedy recovery. I am grateful that noble Lords were able to take forward her points in this debate.

As I said at the outset, protecting women and girls online is an objective of this Bill, which is reflected by the number of priority offences we have included that disproportionately affect women and girls. This includes the addition of the controlling or coercive behaviour offence, and I am grateful for the support from across the Committee for that amendment. This, in addition to the new cyberflashing offence and other criminal law reforms, demonstrates our continued commitment to increase the safety of women and girls online.

The amendments tabled by my noble friend Lady Berridge and the noble Baronesses, Lady Featherstone and Lady Gohir, relate to cyberflashing. The new cyberflashing offence, alongside the package of offences in this Bill, will bring significant benefit for women and girls across the UK, too many of whom have been subjected to the distressing behaviour that noble Lords have spoken about in this debate. We share the aim of noble Lords who have spoken in favour of those amendments to ensure that this offence is effective at stopping this behaviour.

Regarding Amendments 269 and 270, I want to reassure your Lordships that the intent-based approach in Clause 167 has been tested extensively both by the Law Commission and subsequently by His Majesty’s Government. The noble Lord, Lord Stevenson, is correct that we do not automatically agree with what it says, but we do take the commission’s expert views very seriously. The Crown Prosecution Service has stated that it has no concerns about using the offence that has been drafted to bring perpetrators to justice. Indeed, it strongly supported the inclusion of the “sexual gratification” element, which would, according to the Crown Prosecution Service, enable it to prosecute this offence more effectively.

The offence will capture many instances of cyberflashing, such as where pictures are sent to strangers via AirDrop in a crowded railway carriage. I agree with the points noble Lords raised about the settings and the simple technological change that, at an operator level, could make a big difference here. We are well aware of the concern set out by the noble Baroness, Lady Burt of Solihull, that an intent-based approach may let perpetrators off the hook if they send images supposedly for a laugh. We do not accept that view. The courts will, in the normal way, consider all the evidence to determine whether the elements of the offence have been made out. It is of course never on the victim to have to prove the perpetrator’s intention; it is for the police to investigate alleged offences and for the Crown Prosecution Service to establish the perpetrator’s intention in court.

I draw noble Lords’ attention to the inclusion of the word “humiliation” in Clause 167. This will catch many supposedly joke motives, since the perverted form of humour in these instances is often derived from the victim’s humiliation, alarm or distress. This offence has been crafted following calls, including by victims’ groups, to include an intention to cause the victim humiliation.

My noble friend Lady Morgan of Cotes said she was unable to attend the briefing we organised with the Law Commission so, for the benefit of those who were not able to join, let me reassure noble Lords that Clause 167 is based on the offence proposed by the Law Commission, which held an extensive public consultation with victims, the police, prosecutors and academics, and was drafted following further engagement with the police and the Crown Prosecution Service.

The Law Commission, as Professor Lewis set out in that briefing for your Lordships, did consider a consent-based approach, and its final report highlights the significant concerns expressed by respondents to its consultation. A consent-based offence, as the commission found, would result in overcriminalisation, capturing behaviour that does not warrant criminal sanction. For example, as Professor Lewis outlined at the meeting, it could capture a patient sending their doctor an image of their genitals for medical reasons. I take the point that the noble Lord, Lord Stevenson, just made interrogating that. The commission found that it would also criminalise misjudged attempts at intimacy where there was, for example, no genuine intention to cause harm or upset. It has looked at these issues.

Requiring a specific intent is not new and is taken in line with other non-contact sexual offences, including “in person” flashing—the offence of exposure. The police and Crown Prosecution Service are very familiar with these offences and with the evidence that is needed in court to prove the required intent. Crucially an offence based on a lack of consent would shift the focus away from the actions and intentions of the perpetrator to the victim and what they may or may not have done. This would be likely to result in a victim’s previous sexual or private behaviour being interrogated in open court. We do not want victims of this behaviour to be put under that sort of pressure. We want the focus to be fully on the perpetrator’s actions and intentions. The provisions in the Bill have been carefully targeted to protect victims from the intrusive and disturbing behaviour that noble Lords have set out, not to subject them to an unnecessary and distressing interrogation of their private lives.

Changing the consent test to reasonable belief that the recipient had consented, in order to avoid criminalisation, would not work. Applying this test would mean that it would be much easier for genuinely harmful and culpable cyberflashing to escape conviction. For example, it would make it easier for a defendant to make an excuse, such as claiming that they reasonably believed that a person had consented to see a picture because they were on a particular dating app or, as was discussed in the briefing with the Law Commission, claiming that the victim had smiled back at them in a meaningful way on a train. They are not, perhaps, strong defences, but they are not—I am sure—ones that noble Lords would want to encourage through the drafting of this amendment. We are confident that an intent approach is the most appropriate way to frame this offence and that it ensures that the criminal law is workable, so that we can bring perpetrators to justice.

I am sorry to interrupt the Minister in his flow. Just to go back a little bit, the amendment in the name of the noble Baroness, Lady Featherstone, attempted to resolve the questions about where it was legitimate for material of the nature that he has been describing to be circulated. Would he accept that that approach has some merit? If so, then I go on to ask: is the decision still to go with intent rather than consent for reasons other than relating to that particular point?

It is a very narrow point, but it is important in terms of the overall approach that we are taking on this. The Minister very accurately described the reasons that the Law Commission came up with for moving back to an intent-based rather than consent-based approach. I wanted to ask him to check whether the wording in the amendment that we signed up to, in the name of the noble Baroness, Lady Featherstone—ably introduced by the noble Baroness, Lady Burt, and spoken to by many people around the Chamber—would cover off those points where there is legitimate reason for this material to be circulated. I used an unfortunate phrase that I will not repeat. Are the Government happy to accept that it is possible to get around that objection by the Law Commission by making legitimate those particular explicit reasons for those pictures being circulated? I make that point only to get an admission at the Dispatch Box that the Government could get round the issue that has been mentioned, but they are still deciding to go for an intent-based approach for other reasons, which the Minister has just adumbrated and which I accept are genuine.

It may be helpful to the Minister if I repeat the terms of the amendment itself. If you reverse the point and do not have intent, you would still need to consider

“Whether a belief is reasonable”

which

“is to be determined having regard to all the circumstances, including any steps that A has taken to ascertain whether B consents”.

The noble Lord, Lord Stevenson, has absolutely hit the mark on this. This would not lead to terrible consequences and injustices because of this particular qualification.

If the Minister could write to me on the point once he has had advice, or perhaps inspiration from the Box, that would be very helpful.

I will certainly do so. It requires flicking through a number of amendments and cross-referencing them with provisions in the Bill. I will certainly do that in slower time and respond.

We think that the Law Commission, which looked at all these issues, including, I think, the questions put by the noble Lord, has done that well. We were satisfied with it. I thought its briefing with Professor Penney Lewis was useful in exploring those issues. We are confident that the offence as drafted is the appropriate one.

My noble friend Lady Morgan and others asked why both the Law Commission and the Government are taking a different approach in relation to intimate image abuse and to cyberflashing. We are taking action to criminalise both, but the Law Commission recommended different approaches in how to criminalise that behaviour to take into account the different actions of the perpetrator in each scenario. Sharing an intimate image of a person without their consent is ipso facto wrongful, as it is a violation of their bodily privacy and sexual autonomy. Sending a genital image is not ipso facto wrongful, as it does not always constitute a sexual intrusion, so greater additional culpability is required for that offence. To give an example, sending a photograph of a naked protestor, even without the consent of the recipient, is not always harmful. Although levels of harm resulting from behaviours may be the same and cause the same levels of distress, the criminal law must consider whether the perpetrator’s behaviour was sufficiently culpable for an offence to have been committed. That is why we think the intent approach is best for cyberflashing but have taken a different approach in relation to intimate image abuse.

I thank my noble friend for that explanation, which is very helpful; there is a lot in his reply so far that we will have to bottom out. Is he able to shed any light at all on when we might see the drafting of the intimate image abuse wording, because that would be helpful in resolving some of the issues we have been debating?

I cannot give a precise date. The Committee knows the dates for this Committee are a moveable feast, but we have been having fruitful discussions on some of the issues we have already discussed—we had one yesterday with my noble friend. I appreciate the point she is making about wanting to see the drafting in good time before Report so that we can have a well thought through debate on it. I will certainly reiterate that to the usual channels and to others.

Amendment 271 additionally seeks to require companies in scope to provide systems which enable users to report incidents of cyberflashing to platforms. Clauses 16 and 26 already require companies to set up systems and processes which allow users easily to report illegal content, and this will include cyberflashing. This amendment therefore duplicates the existing requirement set out in the Bill. Amendment 271 also requires in scope companies to report cyberflashing content to the Crown Prosecution Service. The Bill does not place requirements on in scope companies to report discovery of illegal content online, other than in the instances of child exploitation and abuse, reflecting the seriousness of that crime and the less subjective nature of the content that is being reported in those scenarios.

The Bill, which has been developed in consultation with our partners in law enforcement, aims to prevent and reduce the proliferation of illegal content and activity in the first place, and the resulting harm this causes to so many. While the Bill does not place any specific responsibilities on policing, our policing partners are considering how best to respond to the growing threat of online offences, as my noble friend Lady Morgan noted in relation to the publication last week of the Strategic Threat and Risk Assessment on Violence Against Women and Girls. Policing partners will be working closely with Ofcom to explore the operational impact of the Bill and to make sure it protects women and girls in the way we all want it to.

I hope that helps noble Lords on the issues set out in these amendments. I am grateful for the support for the government amendment in my name and hope that noble Lords will be content not to move theirs at this juncture.

Amendment 135A agreed.

Amendment 136 not moved.

Amendments 136A to 136C

Moved by

136A: Schedule 7, page 204, line 31, leave out from “under” to end of line 32 and insert “any of the following provisions of the Immigration Act 1971—

(a) section 24(A1), (B1), (C1) or (D1) (illegal entry and similar offences);
(b) section 25 (assisting unlawful immigration).”
Member’s explanatory statement

This amendment adds the specified offences under section 24 of the Immigration Act to Schedule 7, with the effect that (amongst other things) content amounting to encouraging those offences (as per the Serious Crime Act 2007) counts as priority illegal content.

136B: Schedule 7, page 204, line 32, at end insert—

“22A_ An offence under section 2 of the Modern Slavery Act 2015 (human trafficking).
22B_ An offence under section 1 of the Human Trafficking and Exploitation (Scotland) Act 2015 (asp 12) (human trafficking).
22C_ An offence under section 2 of the Human Trafficking and Exploitation (Criminal Justice and Support for Victims) Act (Northern Ireland) 2015 (c. 2 (N.I.)) (human trafficking).”
Member’s explanatory statement

This amendment adds the specified offences to Schedule 7, with the effect that content amounting to those offences counts as priority illegal content.

136C: Schedule 7, page 205, line 36, at end insert—

“32A_ An offence under section 13 of the National Security Act 2023 (foreign interference).”
Member’s explanatory statement

This amendment adds the specified offence to Schedule 7, with the effect that content amounting to that offence counts as priority illegal content.

Amendments 136A to 136C agreed.

Amendment 137 not moved.

Schedule 7, as amended, agreed.

Clause 54: “Content that is harmful to children” etc

Amendment 138 not moved.

Clause 54 agreed.

Amendment 138A not moved.

Clauses 55 and 56 agreed.

Clause 57: User identity verification

Amendments 139 and 140 not moved.

Clause 57 agreed.

Clause 58: OFCOM’s guidance about user identity verification

Amendment 141 not moved.

Clause 58 agreed.

Amendment 142 not moved.

Clause 59: Requirement to report CSEA content to the NCA

Amendments 143 to 148 not moved.

Clause 59 agreed.

Clauses 60 to 62 agreed.

Clause 63: Interpretation of this Chapter

Amendments 149 to 153 not moved.

Clause 63 agreed.

Amendment 154 not moved.

Clause 64 agreed.

Clause 65: Further duties about terms of service

Amendments 155 to 157 not moved.

Clause 65 agreed.

Amendment 158 not moved.

Clause 66: OFCOM’s guidance about duties set out in sections 64 and 65

Amendments 159 and 160 not moved.

Clause 66 agreed.

Clause 67 agreed.

House resumed.