Motion for leave to bring in a Bill (Standing Order No. 23)
I beg to move,
That leave be given to bring in a Bill to make administrators and moderators of certain online forums responsible for content published on those forums; to require such administrators and moderators to remove certain content; to require platforms to publish information about such forums; and for connected purposes.
On 18 June 2017, Darren Osborne drove a van into a crowd of people gathered outside a north London mosque, killing one man and injuring 12 people. He had also intended to murder the Leader of the Opposition and the Mayor of London. Just one month earlier, Osborne was a different person with no history of extreme ideology or beliefs. His ex-partner claimed he had been radicalised in just three weeks. The ground was laid by his mental health problems, long-term unemployment and alcohol abuse, but the catalyst was a television drama about grooming gangs in Rochdale. After seeing it, Osborne spent the next three weeks consuming anti-Muslim extremist propaganda online from sources including Tommy Robinson and Britain First, after which he was ready to kill innocent people.
Whether it is inspiring domestic terrorism, drawing young British people into fighting for terrorist groups overseas, or facilitating the resurgence of the far right, online radicalisation is a frighteningly dangerous process, which we are not currently equipped to counter. Moreover, the spreading of hate, racism, misogyny, antisemitism or misinformation, knowingly and without any accountability, is dangerous and is having a profound effect on our society.
Our newspapers, broadcasters and other publishers are held to high standards, yet online groups, some of which have more power and reach than newspapers, are held to no standards at all, nor are they accountable. It is about time the law caught up. The Bill is an attempt to take one step towards putting that right. It would make those who run large online forums accountable for the material they publish, which I believe would prevent those forums from being used to spread hate and disinformation or to facilitate criminal activity. It would also stop groups being completely secret.
Most of the examples I will use in my speech relate to forums or groups on Facebook. As the most popular social media platform, it is the subject of the most research and press coverage on this issue. Facebook groups range from a few members to hundreds of thousands. They are run by administrators and moderators, who are charged with upholding Facebook’s community standards. Groups are set up to help old school or university friends keep in touch, and for people with a common hobby or interest to come together for the first time. However, there are also numerous examples of forums being used for utterly appalling ends, and deliberately so.
One such group is the Young Right Society, which is run by a Breitbart journalist and was uncovered by Hope not Hate. The charity revealed that the group was
“frequently awash with appalling racist”
content, white supremacy, jokes about the Holocaust, and antisemitic conspiracy theories. It was also used to organise members to attend events. Because the group was set to secret, it was exposed only when one member alerted the charity to its existence.
It is not only racism that is allowed to thrive on such forums. Marines United was a secret group of 50,000 current or ex-servicemen from the US and British militaries. Members used the forum to share nude photos of their fellow servicewomen. A whistleblower described
“revenge porn, creepy stalker-like photos taken of girls in public, talk about rape, racist comments.”
That it was allowed to grow to a membership of tens of thousands before anyone thought something had to be done demonstrates the need for greater transparency.
Research by Professor Jen Golbeck of the University of Maryland shows that members of closed online forums feel at risk of becoming social pariahs if they report objectionable posts, so they do not. It is the format of the forums that normalises the content for their members. People will receive far more “likes” for a post to a group, which will be seen by a far wider audience, than for something they post about their personal life on their own page, but any post that goes against the grain draws harsh criticism and potential exclusion from the group. That can have a profound effect on society’s attitudes on a number of issues. Many of the people in such forums will not have joined because they agree with the most extreme content, but regularly reading unchecked posts espousing unacceptable or false views can alter their perceptions.
There are already several examples of such forums having troubling effects in the real world. Canada’s largest far-right organisation started as a Facebook group on which members circulated Islamophobic conspiracies, decried the supposed Islamification of Canada, and were encouraged to send pigs’ heads and blood to mosques, plans that were then carried out.
The most disturbing example I have come across is a forum of 8,000 parents of autistic children here in Britain. It is the sort of group that Facebook might point to as an example of how the site can bring people in a difficult situation together to share their experiences, but it was being used in a truly horrific way. Members told the group that giving their child so-called “bleach enemas” would cure their autism. Several parents carried them out and then shared images of what they believed to be parasites leaving their children after the process, but which were in fact the children’s bowel linings. The group’s secret setting meant that charities such as the National Autistic Society were locked out until one member contacted the police.
One thing we know is that social media companies cannot be relied upon to monitor and regulate the forums themselves. They have proved that time and again.
The Russian Internet Research Agency’s actions in the lead-up to the 2016 US presidential election show that these forums are an open door into our democracies. Our political parties are also struggling with the phenomenon. Abusive and hateful views and misinformation about politics or politicians can quickly take hold unchallenged, and unacceptable views, such as antisemitism or the idea that violence is an appropriate political tactic, can become normalised.
I am talking not about censorship or small private gatherings but about accountability for powerful and large-scale publishing and sharing. Social media can be a wonderful thing that brings together people who might otherwise have never met, but there is no reason why we cannot maintain that while also regulating the space to prevent its abuse. So far we have largely left the online world as a lawless wild west. If 1,000-plus people met in a town hall to incite violence towards a political opponent, or to incite racism or hate, we would know about it and deal with it. The same cannot be said of the online world.
Figures from the House of Commons Library show that, in 2007, 498 people were convicted under the Communications Act 2003 for sending grossly offensive messages. After a decade in which social media became a prominent part of most people’s lives, that figure had nearly tripled, yet the Act under which they were convicted was passed in 2003, before Facebook even existed. Stalking and harassment represent 60% of online crime, but, despite the increase in convictions, online hate crimes are still rarely prosecuted and go largely unreported. Our laws desperately need to catch up. Today I am proposing a small step: to establish clear accountability in law for what is published on online forums and to require those who run them to stop permitting hate, disinformation and criminal activity. I commend the Bill to the House.
Question put and agreed to.
Ordered,
That Lucy Powell, Nicky Morgan, Robert Halfon, Robert Neill, Mr David Lammy, Anna Soubry, Mr Jacob Rees-Mogg, Ruth Smeeth, Luciana Berger, Stella Creasy and Jess Phillips present the Bill.
Lucy Powell accordingly presented the Bill.
Bill read the First time; to be read a Second time on Friday 26 October, and to be printed (Bill 263).