
Online Harms White Paper

Volume 797: debated on Tuesday 30 April 2019

Motion to Take Note

Moved by

My Lords, I repeated a Statement in the House on the online harms White Paper on the day that we published it, 8 April. There was not enough time for all noble Lords who wanted to contribute to do so, and so the Chief Whip kindly made me available for noble Lords to make their points at greater length and with the benefit of more time to think about this difficult problem. I am grateful for the opportunity to listen to noble Lords’ views.

This White Paper is an important document and a world first. For many people nowadays, the internet is an integral part of daily life. However, illegal and unacceptable content and activity remain far too prevalent online. There is currently a range of voluntary initiatives that try to address some of these problems, but while there has been some progress, the efficacy and pace of these actions have varied widely across different companies. These inconsistencies still leave too many users unsafe online, and the current regulatory landscape lacks the scope and coherence to tackle this complex set of problems. That is why we have published this White Paper, which sets out an ambitious and coherent framework for tackling harmful content and activity. This will make companies more responsible for their users’ safety online, especially that of children and other vulnerable groups, and will help build trust in digital markets. The online harms we are tackling include behaviour that threatens users, particularly children and the vulnerable, and behaviour that undermines our national security or aims to fracture the bonds of our community and our democracy.

To tackle these harms, we intend to establish in law a new duty of care on companies towards their users, overseen by an independent regulator. This regulator will set clear safety standards, backed up by mandatory reporting requirements and effective enforcement powers. Companies will be held to account for tackling a comprehensive set of online harms ranging from illegal activity and content to behaviours that might not be illegal but are none the less highly damaging to individuals and society. They will be required to take particularly robust action to tackle terrorist content and online child sexual exploitation and abuse.

We recognise that a very wide range of businesses, such as retailers, consumer brands and service providers of all kinds, currently enable some degree of user interaction or user-generated content online. Although we will minimise excessive burdens according to the size and resources of organisations, all companies will be required to take reasonable and proportionate action to tackle harms on their services.

The regulator will have sufficient enforcement powers to take effective action against companies that breach regulatory requirements and to uphold public confidence, while also being fair and proportionate. These will include the power to levy substantial fines, and we are consulting on even more stringent sanctions.

As a world leader in emerging technologies and innovative regulation, the UK is well placed to seize the opportunities presented by the measures set out in the White Paper. We want technology itself to be part of the solution, and we propose measures to boost the tech safety sector in the UK, as well as measures to help users manage their safety online. Furthermore, we believe that this approach can lead to a new, global approach to online safety that supports our democratic values and promotes a free, open and secure internet. The Government will look to work with other countries to build an international consensus behind it. We will seek to work with international partners to build agreement and identify common approaches to keep citizens safe online. Having these relationships will support the UK’s ability to put pressure on companies whose primary base is overseas.

Since the White Paper was published earlier this month, the reaction has been generally positive. Noble Lords who spoke in the earlier debate, and Members in the other place, welcomed the Government’s action in this crucial area, and much, although not all, of the media coverage has also been supportive. However, I would like to focus on a couple of areas where our proposals have come under close scrutiny.

First, there has been comment in some newspapers that the measures we have set out in the White Paper will fetter the freedom of the press. I reassure noble Lords that that is not the case. The Government strongly support press freedom and editorial independence. A vibrant, independent, plural and free press that is able to hold the powerful to account is essential to our democracy. Journalistic or editorial content will not be affected by the regulatory framework that we are putting in place. Furthermore, the regulator will have a legal duty to pay due regard to protecting users’ rights online—in particular, their privacy and freedom of expression. The regulator will not be responsible for policing truth and accuracy online.

There is a question of whether newspapers’ comment sections will fall within the scope of the online regulator. We are consulting on proposals for the statutory duty of care to apply to companies that allow users to share or discover user-generated content or interact with each other online. However, as the Secretary of State made clear in the other place, where these services are already well regulated, as is the case with IPSO and Impress regarding their members’ moderated comment sections, we will not duplicate those efforts.

The second area where concerns have been expressed since the White Paper’s launch concerns the potential burdens on small and medium-sized enterprises. Companies within scope will include SMEs and start-ups, but a key element of the regulator’s approach will be the principle of proportionality. The regulator will be required to assess companies according to their size and resources. The regulator will also take a risk-based approach, focusing initially on companies whose services pose the biggest risk of harm to users, based on factors including the scale of the service. The regulator will have a legal duty to pay due regard to innovation—indeed, the regulatory framework set out in the White Paper is pro innovation and will preserve the openness and enterprise that lie at the heart of the UK’s flourishing tech sector.

I believe that we have both a duty to act to protect UK citizens and an opportunity to lead the world on this issue. I firmly believe that this White Paper is a valuable step forward in creating a safer and stronger internet that works for the benefit of all humankind. To get this right, we will need to work with our civil society, our technology sector and, of course, Members of both Houses. We are consulting on the White Paper and have already received around 1,000 responses. As part of that, I am looking forward to hearing noble Lords’ contributions. I beg to move.

My Lords, I am very happy to contribute to the positive way in which the Minister has presented the case. I am delighted that, as promised by the Government, time has been made for adequate consideration of the issues in this debate. However, I am disappointed that more people are not here. There was such a swell of enthusiasm when this matter came before us the first time that I thought we would have a much better-peopled debate and a longer list of speakers. However, we are here and the ideas are waiting to be explored.

I am happy that this debate is taking place during the period of consultation and I hope that the record of this debate will contribute to the documentation being considered. It will make it 1,001 contributions thus far.

Bold claims have been made for what is hoped to be the result of this process. By the way, it is good to start with a White Paper and with regular rounds of conversations. The bold claims include the Government saying that they are going to create the safest place to be online and that this will be a world first, with no one having done it before. They also say that it could be part of a global response to perceived needs in this area. I feel that we are making something available for our country by way of regulation in respect of a global industry that is very difficult to contain within any framework that I can imagine. We will be hearing from various speakers about regulation, so I shall not deal with that now. The duty of care has already been mentioned. I wish that the digital charter had crept somewhere into the narrative because there are lots of ethical issues that would make it very appropriate to consider it.

There is much else in the White Paper but I want to focus on the list of harms on page 31. I shall not go through them all but I note the three columns headed “Harms with a clear definition”, “Harms with a less clear definition” and “Underage exposure to legal content”. There is a list beneath each heading. I want to compare those lists with the ones that appear in another DCMS document. I was reading it not for this debate, to be quite honest, but for the debate last week on advertising and the internet. It came out of the same stable as the White Paper. I am calling it the Plum report because that is what is on the front cover. It is called Online Advertising in the UK. It was commissioned by the Department for Digital, Culture, Media and Sport, and it was published in January 2019, when drafts of the White Paper must have been in DCMS. As I said, it is from the same stable. On pages 17 to 19 of this report there are three lists of potential harms to be found online. They have different names from those in the White Paper: individual harms, societal harms and economic harms. This document was produced with the debate on the Communications Committee’s report on advertising and the internet in mind, and to feed into the Cairncross report on local journalism. But the two lists—in the White Paper and the Plum report—must be looked at together. They are rather unlike each other and point to things that we dare not ignore.

After the debate on the Statement, to which the Minister referred, I had a conversation on the Floor of the House with the noble Baroness, Lady Neville-Rolfe, who I am sorry to say is not in her place today. She was worried about the absence in the White Paper of any reference to economic harms. I do not believe she was thinking about the responsibilities of small and medium-sized businesses, which would be the same, proportionately, as those of other institutions and bodies; she was talking about online harms to these small and medium-sized businesses. These concerns have been picked up by other commentators too.

The list of economic harms in the Plum report includes:

“Product bundling and exclusivity … ‘Walled Gardens’”,

on which stakeholders express concern that it is hard,

“to export user ID data collected during advertising campaigns”.

The list also includes:

“Lack of transparency in programmatic display … Differential treatment”—

whereby some companies are given better treatment and so on—as well as “leveraging”, “engagement with industry initiatives”, in which market players “do not always adopt” industry standardisation, and “control of web browsers”. It is quite a list, and of a different kind from the one in the White Paper. I wanted to keep these lists together.

After that same debate, I had another conversation, this time with the noble Baroness, Lady O’Neill. It is always a frightening experience to talk to the noble Baroness; she is clever and I do not feel that I am. If I felt even a little clever, I would feel much less so after a conversation with her than I did when I began. She is a quite remarkable woman, whose recent publications are on the subject of trust. Her earlier work was on Immanuel Kant, whom I have barely ever understood; the right reverend Prelate will be better versed in him than I am. These books on trust, however, seem to be looking, as a philosopher should, at a very important subject. Anyway, in this conversation, the noble Baroness expressed her worries about the lack of reference in the White Paper to societal harms. She and I have been greatly impressed by—and shared our impressions of—the recent book Democracy Hacked by Martin Moore, which looks forensically at the damage done online to our democratic institutions.

On societal harms, the list revealed in the Plum report is again very revealing. It includes,

“financial support for publishers of offensive or harmful content”—

that is, providing means of monetisation for those creating harmful content on platforms—as well as discrimination, which can occur either by design or inadvertently when advertisers target data to categorise people by gender, ethnicity and race. The list also includes “non-transparent political advertising”, whereby anonymous actors may “influence elections and referendums”.

It is interesting that in tomorrow’s Oral Questions, the noble Baroness will ask a Question on this subject. I am sure she will want to quote the sympathy of the Information Commissioner, Elizabeth Denham, on this very matter. The contribution I want to make as the subject opens up today is to identify, and in some way feel comfortable with, the range of online harms that we are referring to. They tend to be, as in the White Paper, to do with the plight of individuals. If that is the desired outcome, it ought to be said clearly that this is what we are dealing with. But online harm is a much more generic term, and the economic and societal aspects deserve to be mentioned.

I conclude by saying that the Secretary of State has set himself a very difficult target. He wants a Bill that will put the UK’s house in order on a truly global matter of concern. How that will be done we wait to see. The proposals aim to get the right balance between the long-overdue regulation in this area and continuing adherence to the principles of free speech; the Minister has already given assurances on that. He is also looking to produce legislation that, while he gives it his best attention, will be overtaken by rapid development in the field of technology, even as we debate the Bill. We must look for a Bill that is light on its feet, flexible and can be put to work, rather than something static, heavy and fixed that will be out of date as soon as it becomes an Act of Parliament.

I look forward to hearing other views because, at this stage, this is a conversation. I look forward to shaping a document that, ultimately, will go beyond what we are comfortable with as a step in the right direction and needs to go much further.

My Lords, it is always a pleasure to follow the noble Lord, Lord Griffiths of Burry Port, and I certainly want to follow the spirit of his intervention. Last Thursday, we had something of a dress rehearsal for this debate when we discussed the Communications Committee report UK Advertising in a Digital Age.

In the course of that debate the right reverend Prelate the Bishop of Durham quoted his son saying:

“Dad, you haven’t a clue … I have been raised in this digital world. I am inside it, day in and day out. You just don’t get it and your generation will struggle to”.—[Official Report, 25/4/19; col. 725.]

I was particularly sensitive to those comments because I suspect that my own children, all in their 20s, have a similar view of my capabilities. I am happy that my noble friends Lady Grender, Lady Benjamin and Lord Storey, all more savvy in this area than I am, will follow.

The truth is that the gap in comprehension between legislators and practitioners was there for all to see when the CEO of Facebook, Mark Zuckerberg, appeared before a Senate committee. The question out there is whether the Government and Parliament—as the noble Lord, Lord Griffiths, has just indicated—are flexible and nimble enough to address genuine public concerns and stay ahead of the curve as some of these technologies develop at breakneck speed. Perhaps it is a job for the Youth Parliament rather than this one. As I said last Thursday, many of our procedures and conventions have their roots in the 18th century not the 21st. In approaching this, therefore, we have to look not only at the legislation but at how we consult and involve people in introducing steps as we go forward.

As the Minister has said, there has been a general welcome for the direction of travel proposed by the White Paper. There are harms which need to be addressed, as demonstrated by the list referred to by the noble Lord, Lord Griffiths, and as explained to us by the noble Baroness, Lady Blackwood, in the Statement that preceded this discussion.

It is true that the White Paper is not without its critics. Last week, in evidence to a DCMS sub-committee, the Information Commissioner, Elizabeth Denham, expressed surprise and disappointment that the White Paper had not,

“done a comprehensive examination of political advertising and oversight that’s needed in this space”,

and the Electoral Commission has called for a range of measures to strengthen its oversight and promote transparency in digital campaigning. The Alliance for Intellectual Property caught some of the points made by the Minister and the noble Lord, Lord Griffiths, about the effect on business. It said:

“The paper fails to address the harmful activity that affects businesses, in terms of revenue generation, investment and creative innovation”.

Another group, Defend Digital Me, warns against giving the Home Office carte blanche to regulate the internet, saying that children must not be the excuse that is talked up into a reason enabling greater control of the internet by the Home Office. It expresses particular concern about paragraph 21 of the White Paper.

There are real and present dangers out there to be addressed, but also concerns that ill thought-out measures could undermine some of the real benefits that the internet has brought us. The challenge is to produce an internet that is open and vibrant, yet also protects its users from harm. The days are long gone when public opinion was content to see the internet as a kind of Wild West beyond the rule of law. We have now reached a situation where Mark Zuckerberg of Facebook said:

“If the rules for the internet were being written from scratch today, I don’t think people would want private companies to be making … decisions around speech, elections and data privacy without a more … democratic process”.

Quite so.

Nor are we starting from an entirely blank sheet of paper. In the Information Commissioner, we have someone with authority and respect both at home and abroad. We have an Electoral Commission that will need extra resources and new powers to protect our democracy from abuse carried out using new technologies. Ofcom, a creation of the Communications Act 2003, has proved a highly successful and respected regulator. We also have good examples of international co-operation in the field. The EU general data protection regulation is now embedded in the Data Protection Act 2018 and is a good example of addressing online harms via international co-operation. I understand that the GDPR is now being looked at by a number of other jurisdictions, which are using it as a template for their own legislation. Taking up the point that the Minister made in his opening remarks, I see no reason why we should not aspire to global conventions which the whole world can adopt.

In so doing, we must be aware that elsewhere in the world, authoritarian Governments are attempting to insulate themselves from transparency and accountability by trying to curb and shackle the internet, precisely because of its ability to shine light into dark corners. Of course we want to see the freedom of the press upheld. I think the technology is taking us into difficult areas here. There is an overlap between print media organisations and their online publications, and there are questions about where the various jurisdictions apply. Before those organisations get too indignant, it is interesting to note that the worst offender following the Christchurch tragedy was Mail Online, which continued to carry a video of the tragedy, and the manifesto behind it, long after Facebook had taken them down.

The White Paper paints a very broad canvas and, as I have cited, critics call for more action and greater safeguards. I just wonder whether draft legislation would not benefit from pre-legislative scrutiny along the lines of the Puttnam committee, which examined the Communications Bill in 2002 and on which I served. I am delighted to see the noble Lord, Lord Puttnam, in his place. That committee held hearings in public and on the air. As a Joint Committee, it was able to draw on strengths and experiences from both Houses.

By the end of the process, we will have a suite of powerful regulators overseeing these matters: the new super-regulator envisaged by the White Paper, the ICO, Ofcom, a better resourced and empowered Electoral Commission and a revitalised CMA. But how will they work together? Who will report to whom? Will some take responsibilities already held by other regulators? There is a lot of thinking to be done. In the Statement the noble Baroness, Lady Blackwood, spoke about a need for coherence in the way government approaches this. I wonder how interdepartmental co-operation will be achieved. Will there be a special Cabinet committee on this? How will that coherence across Whitehall be achieved?

Parliament, too, will have to give careful thought to how best it links in with this new regulatory framework, either by creating a Standing Committee of both Houses or perhaps by creating an advisory committee akin to the Bank of England’s Monetary Policy Committee, consisting of those best qualified to give advice on new developments in technology which would allow government and Parliament to future-proof as best we can, while keeping oversight of the new technologies within democratic control.

I hope this does not do too much damage to the reputation of the Secretary of State for DCMS, but I worked with him for a couple of years in the coalition Government. I have always admired his lawyerly calm. This will be much needed as we move ahead in this area. There will be great pressure on us to do something quickly. There is obviously a need to bring forward statutory regulation and there will be a need for education and training, to which some of my colleagues will refer. As well as the need to move quickly, there is also a need to get it right. Perhaps in helping to achieve that end, this House might yet prove its usefulness to my children and to the son of the right reverend Prelate the Bishop of Durham.

My Lords, it is a pleasure to follow the noble Lord, Lord McNally. We once appeared on “Question Time” together, although it was the Reading University version, rather than the BBC one.

John Perry Barlow, the libertarian and Grateful Dead lyricist who died last year, wrote in 1996 that the internet was,

“creating a world where anyone, anywhere may express his or her beliefs, no matter how singular, without fear of being coerced into silence or conformity”.

To national Governments, those,

“weary giants of flesh and steel”,

he directed a famous warning in his Declaration of the Independence of Cyberspace:

“You are not welcome among us. You have no sovereignty where we gather”.

Those words still have the capacity to inspire, particularly in the start-up culture of Silicon Valley where First Amendment freedoms are sacred and trust in government is low. Having been lucky enough, as we all have, to live through the early stages of the communications revolution that is in the process of transforming our world, and having benefited incalculably from the connections it has brought me to people and sources of knowledge that I would never otherwise have encountered, I would go so far as to say, as Wordsworth controversially said of another revolution:

“Bliss was it in that dawn to be alive”.

But as this White Paper repeatedly demonstrates, the scale and intensity with which communication is now possible have brought in their wake the potential for new and serious harms, harms for which counter-speech and alternative narratives are a necessary but insufficient answer. Even the imperative of free speech, central though it is, cuts both ways. Bullies, stalkers and foul-mouthed abusers inhibit the online freedoms of others, in much the same way as anti-social behaviour in the real world drives the most vulnerable from the public square. The risk that free speech will be chilled by overregulation is real and acute. However, underregulation too can inhibit freedom of speech, in particular, the freedom of the women and minority groups who, in Parliament—and, I suspect, elsewhere—attract a disproportionate amount of online abuse. It was saddening, even shocking, to read in the White Paper that 67% of women in the UK experience a feeling of apprehension when thinking about using the internet or social media.

Regulation, to my mind at least, should be a last resort. How sure are we that it is needed? After all, we have laws against the dissemination of terrorist materials, malicious communication, defamation, the incitement of racial and religious hatred and the intentional causing of harassment, alarm and distress. No doubt other such laws could and will be imagined, although never, I hope, the overbroad restrictions on so-called “extremist activity” that were contemplated in the Queen’s Speeches of 2015 and 2016. However, laws of this kind were developed for a world of physical interactions and legal borders. They require perpetrators to be identified and brought to justice in our own jurisdictions. Those who are abroad, or who can effectively ensure their anonymity, cannot be reached. The delicate framework of our analogue laws is not on its own sufficient to contain the turbocharged power of internet communication, let alone to discourage online behaviour that is anti-social rather than unlawful.

What, then, of self-regulation by the internet intermediaries? It already exists, of course, and will continue to be central to any regulatory scheme, but its inadequacy is illustrated by the regular evidence sessions—most recently last week—in which Facebook, Twitter and YouTube are questioned by the Home Affairs Select Committee. They speak of their high standards, terms of service and internal guidelines. They claim credit for recruiting human moderators, for their use of AI and for suspending and deleting accounts. They in turn are criticised for their lack of transparency, for the patchy and inconsistent application of their standards and for their unwillingness to volunteer information that could be of assistance to law enforcement. This is not a satisfactory state of affairs. Partly, that is a function of the sheer size of the task, with hours of video uploaded to YouTube every second, limited numbers of human monitors and algorithms that are good at spotting nipples and rather less good at spotting irony. These problems will continue whoever sets the standards.

But the status quo also reveals a democratic deficit. We in Parliament, not unaccountable executives in California, should be approving the ground rules for those who do business in our country, and an independent regulator, accountable to Parliament, should be encouraging compliance and enforcing where necessary. That is what democratic governance of the internet needs to look like, as some of the tech companies seem now to acknowledge. The concept of the statutory duty of care, proposed by Professor Lorna Woods and given effect in this White Paper, is one that I support, as is the principle that companies cannot realistically be required to check every piece of content before upload for lawfulness, irrespective of whether we remain subject to the e-commerce directive, which requires that to be the case.

Much of the criticism of the White Paper has centred on the use of nebulous terms such as “trolling”, “extremism”, “harm” and “offence”. These are far too broad to be treated as blanket prohibitions with coercive consequences. Some of them could usefully be lost altogether. But in broadcasting, at least, the concept of harm has proved tolerable in the context of a detailed and context-specific code of practice. My experience of Ofcom, which I should say has extended to representing it in the recent Gaunt case in the English courts and in the European Court of Human Rights, is that with really good guidance even quite broad concepts are capable of being applied with ample regard for human rights. As it explained its approach to me at the time, every case is a freedom of expression case and it starts with a presumption of freedom. It remains to be seen whether the internet platforms are as susceptible to regulation as the broadcasters with which they so often nowadays share a screen. The sheer volume of material, and the fact that it is not generated by the platforms themselves, will make this harder and will inevitably require the regulator to prioritise.

Some platforms might react by overcensoring the content that they allow to be carried—a risk that surely must be mitigated by some mechanism for independent review. Others might display the “refusal to act responsibly” ascribed to Facebook last week by the Privacy Commissioner of Canada in his statement on his Cambridge Analytica investigation—I hope that Sir Nick Clegg was listening. There will be practical difficulties, some of them unexpected, because, as the Minister said when introducing the White Paper, no one has done it before.

Liam Byrne MP has pointed out that, for all the benefits brought by the previous Industrial Revolution, it took numerous Factories Acts over the course of more than a century before its worst excesses could be curbed. This White Paper leaves many important issues for another day—market concentration, unattributable personalised advertising, lack of algorithmic transparency—but I like it more than I expected to, and I hope to participate on its journey into law.

My Lords, this is a vast subject, and I will limit my comments to just a few areas.

I and others on these Benches welcome this White Paper, in particular the attempt to rethink the way we see this whole area. In the past we brought in individual laws to deal with particular problems. My colleague the right reverend Prelate the Bishop of Chelmsford has been arguing for some while that we need to see this as public space. We need to try to understand how we can regulate it from first principles in a way that guarantees the freedoms we want and the huge benefits that have come through the online world, which has made an incalculable difference to our lives, but also protects the many people who are vulnerable. We have heard some account of just some of the problems some people have faced.

If the Government are to achieve the aim of making this country the safest place in the world to go online, we need to learn from other industries that are also seeking to be regulated. The whole question that I have been most closely associated with and have taken a particular interest in is regulating the gambling industry. It seems to me that there are a number of parallels that we need to take on board if we are to think about what might be the appropriate way of regulating.

It has been encouraging that the Gambling Commission has taken a stronger line on an industry that in the past performed abysmally in its duty of care to its customers. If companies such as Facebook, Snapchat or YouTube are to behave, the regulators will need to have significant powers and there will need to be real independence. Yet, looking at some of the other regulators, why on earth should these companies fear them? For example, just as the gambling industry’s gross gaming yield continues to grow far into the billions, Facebook’s revenue is now around £55 billion per annum. The substantial fines that we have been perennially promised in the battle to combat wrongful behaviour are very modest and really do not make the companies blink for one moment. Indeed, some cynics have argued that some of these companies simply budget for the fines as part of their ongoing business so that they can keep going as they have in the past. Similarly, the largest fine Facebook has received from a UK watchdog is around £500,000. These companies are in a totally different league. Therefore, there is a question about not only how we regulate them but how we get them to engage with the wider debate about the sort of world we want to create.

When I speak with families who have lost loved ones to gambling-related harms, they want to know why companies rarely lose their licences. It appears that the larger high street companies have little fear that they ever will. Putting it very bluntly, one of the questions I want to ask is: could it be envisaged, under the proposals as they emerge, that some of these companies could actually have their licences revoked if they are not able or willing to deliver the public goods we need them to deliver?

The White Paper, perhaps not surprisingly, gives limited detail on the funding and membership of the regulator and the regulatory body. I would be very concerned that a solely industry-funded organisation might lead to a culture of mistrust, especially around the urgent need for independent research and scrutiny. The regulator and any regulatory bodies need to be completely independent of the industry. We would be kidding ourselves if we did not think that the industry is already recruiting and deploying people to lobby individual Members of both Houses; I am sure that is already going on.

But all this is irrelevant if the regulators themselves are ignored. To go back to gambling, the issue with which I have been closely associated: last week I was surprised to see the Minister for Sport and Gambling appear to dismiss a call for a mandatory levy just minutes after the chair of the industry regulator had called for precisely that. Can the Government pledge to this House that any similarly tough calls from the regulator will be championed, not rejected?

I have one or two other concerns that I want to touch on at this early stage of our discussion, as we set out the various issues to be debated. For example, the White Paper clearly highlights how certain sorts of online products exploit addiction. This is about not just gambling but gaming. The evidence has been growing very consistently that many things online can be hugely addictive; indeed, they are designed to keep you online for long periods. Many people will be aware that a large proportion of company profits stems from the small group of people with problems, who are addicted to gambling, gaming or whatever it is; indeed, the companies rather rely on them for their profits. Yet we do not seem to have much in this White Paper about how to deal with these products, which are designed to be addictive. This was picked up by the noble Baroness, Lady Kidron, in her report Disrupted Childhood.

What are we going to do to stop these things taking hold of people’s time and energy, and in some cases becoming quite obsessive? For example, will future legislation require gaming companies to have a “pause” function, allowing people to stop and take a breath? What about a mechanism to regulate the “dark nudges” I have been reading about? People with potential addictions find that, at just the moment they try to come off, some new offer is made to them in an uncanny way that seems designed to keep them there. Talk to people who are addicts or in recovery and you hear how problematic these nudges are. How are we going to address these issues? With so many online companies using addictive products, it would be good for many users if the as yet unnamed future regulator ensured we had some research on how to deal with this sort of issue.

I would hate the Minister to think that a regulator is going to be a silver bullet. This is a much bigger societal issue that we have to go on debating. I am concerned that, while the White Paper portrays the experience of the gambling sector as sunlit uplands, all thanks to an industry regulator, that is not how it appears to many addicts, or to the families who have lost loved ones. Vulnerable people and children using the internet deserve the right level of regulation. That will involve some self-regulation, though I have to say that, in my experience of talking with people, self-regulation is not working very well in other industries. Just last week, Snapchat was revealed to have extraordinarily weak standards of age verification.

I have been listening very carefully and have followed the debate from the beginning. Would the right reverend Prelate accept that, given that there is a greater urgency about this matter than just dealing with the White Paper—although that is extremely welcome and constructive—and given that even as we speak, primary school children are accessing hardcore porn, sadly it is now time for rather draconian measures?

I thank the noble Lord for his question. It was precisely my point, about a regulator having very significant powers. My question about whether self-regulation will work touches on a number of areas. That was why I was saying that I do not think self-regulation will work. Although it is something that we need to encourage, it does not yet have a track record that we can have much confidence in.

If I may finish—the noble Lord caught me at the very end of what I wanted to say—that is why I do not think that a light-touch approach is going to be the answer. We need to work away at how to balance these various needs, both for the freedoms and for the protection of the vulnerable.

My Lords, I am very happy to follow the right reverend Prelate, and I agree with much of what he says. I do not quite agree with him on everything, but I will come to that in a moment.

Generally, I welcome this White Paper. During my time as a Member of the European Parliament, I became what is loosely termed an Internet Watch Foundation champion. I do not think I have ever been a champion of anything, but I was very happy to support the Internet Watch Foundation, which for many years has done enormously good work detecting and trying to deal with instances of abuse on the internet. In fact, quite a long time ago, in 2011, I was the European Parliament negotiator when it was decided that child pornography and material facilitating child abuse should be removed from the internet whenever possible, and blocked when removal was not possible. Of course, at that time the levers and tools to deal with these matters were not really in place. The IWF continues to do good work, and I hope it remains part of what it has always regarded as a partnership approach within the new regulatory framework.

I disagree a little with the right reverend Prelate, and, it seems, with some colleagues, on this question of self-regulation. We should be aware that this country has had much greater success than others in removing content from websites hosted here, and that has, until now, inevitably been part of a self-regulatory approach. I welcome the setting up of a regulator, although my experience of other regulators is a little mixed; regulators have to have clear powers and be able to enforce the regulations for which they are responsible. I certainly welcome the suggestion of codes of conduct for the industry and the service providers, but in preparing these it is vital that we have full consultation, as we are currently having on the White Paper, with industry and with relevant NGOs, including the Internet Watch Foundation and the law enforcement authorities.

Any proposals must also be adaptable. I was dealing with this matter in 2011, but even before that we were aware of the emergence of the internet without fully understanding how it would develop. We therefore need to think of this as what I would describe as smart legislation: legislation that can adapt and change when circumstances and technology change. I think a co-operative approach, on a partnership basis, should remain in place alongside a regulator to deal with some of the worst offences or threats. If you take away this partnership approach and this self-regulatory element, you will have great difficulty in maintaining the necessary good will, which is very important in dealing with something that is not just British, not just domestic, but essentially international in its implications.

The regulatory regime should be a last resort, if other means are not achieving the ends we need. There is an inherent risk in what is being proposed here that it could lead to legalistic and obstructive action, with proscription replacing persuasion and agreement, which often produce positive outcomes. The good intentions of a regulator must not stamp on or pre-empt the good will I referred to, or the voluntary rectification. As I have said, the UK is in many ways in the lead. We are not the worst country; we are not a country where these abuses are particularly noted. We have the fastest removal rates for offensive material in the world. Industry has responded to concerns, and new tools such as web-crawling technology, image classifiers, image hashing and webpage blocking are regularly deployed. That is why I think it best to mix punitive measures fairly with maximum co-operation. Child sexual abuse imagery hosted in the UK is now at a lower level than it was 15 years ago. Of 105,000 webpages found to contain such material in 2018, only 41 were hosted in the United Kingdom. That is not a cause for complacency, but it shows that our approach to hosting, and our work with host sites, is having some results.

The National Crime Agency estimates that at least 80,000 people in the UK regularly view child sexual abuse images; however, they are viewing them mostly on sites hosted outside this country. One of my concerns about the White Paper is that the proposed regulator’s powers may not offer a sufficiently open-door approach to those who wish to report offensive material. There is reference to the need for redress and for a very effective redress system, but it needs more than that. We need to make sure that the regulator works not in isolation but with others, as I said before.

Regarding the funding of a possible new regulator, we would obviously look first to industry to pick up the bill. That is important, but the right reverend Prelate spoke about GambleAware. Speaking as a former gambling Minister—I may have put that wrongly; I mean as a Minister formerly responsible for gambling in this country—I suggest that comparisons with GambleAware and Drinkaware are probably not terribly helpful, because this is a very different case. In those cases, the drink and gambling industries came forward with proposals to limit activity, whereas here we are talking about elimination rather than limitation.

Finally, I would like to refer to the current provisions mentioned in the White Paper. I think one or two noble Lords have also referred to them. First, I am sure your Lordships will be delighted if I mention that there is quite a lot of EU material here. The EU e-commerce directive of 2000 was referred to a few moments ago by the noble Lord, Lord Anderson. That directive was certainly important, but it did not make service providers liable for content; it obliged them to remove illegal content, while under Article 15 there was no general obligation to continue monitoring those sites in the way that I suggest we now think there should be.

Secondly, and this is the big point, I was a shadow rapporteur on the EU general data protection regulation. That is important legislation, and it will have an impact here, not on everything but certainly on the activities of the regulator and on the areas of redress. When my noble friend the Minister winds up this debate, perhaps he could make further reference to those EU regulations and directives. How are we going to ensure, in an international setting and with the clear pressures now on us in this area, that we are able to replicate them and that our colleagues elsewhere in the world, where much of the abuse of the internet originates, will continue to comply with standards and a quality of approach equivalent to our own?

My Lords, I too am extremely grateful to the Chief Whip for allowing time for what is already proving to be a worthwhile and timely debate, and to the Minister for introducing it in such a positive manner. I entirely sympathise with the Government’s instinct to focus on the most obvious forms of online harm, such as child sexual exploitation and abuse, the promotion of terrorism and threats of actual violence. The Government’s proposals in these areas are, on balance, carefully thought through and proportionate. They represent a bold attempt to tackle some of the more damaging features of the digital age and are based on a duty of care—a principle I have long advocated in relation, for example, to what I believe to be the responsibilities of the media to ensure informed democratic debate. I hope it will not be self-promoting to mention that in 2012, I did a TED talk under the auspices of this House on this subject. It has to date been seen by almost 1 million people, so there is no doubt that there is interest in this area.

In the time available this evening, I would like to touch on other forms of harm which in my view are insufficiently addressed in the White Paper. The harms I refer to were identified by the noble Baroness, Lady O’Neill—sadly, she is not in her place—in response to the Statement on 8 April as being,

“harms to public goods, democracy, culture and the standards of the media”.

She made the point that the White Paper,

“deals with only part of the problem”.—[Official Report, 8/4/19; col. 433.]

As your Lordships have already heard from the noble Lord, Lord McNally, during her recent appearance before the DCMS Select Committee on this White Paper, Elizabeth Denham, the Information Commissioner, said—I will quote her a little more fully—that she was,

“surprised and disappointed that there was not more focus on a huge societal harm, which is electoral interference, and on the need for more transparency in political advertising”.

Like others, I entirely share the commissioner’s disappointment.

We are only too familiar with the pernicious and corrosive effects of online propaganda in the form of disinformation, which has already had a distorting effect on almost every area of our domestic and democratic lives. Much of that distorting effect has been caused by a worrying lack of understanding of how easily we can all be manipulated. Anyone doubting the impact of that manipulation has only to turn to the recent DCMS Select Committee report on Disinformation and Fake News. As that report accurately states:

“In a democracy, we need to experience a plurality of voices and, critically, to have the skills, experience and knowledge to gauge the veracity of those voices”.

Many people have rightly commended the Government’s White Paper for being a global trailblazer in its strategy for tackling online harms. However, I doubt whether I will be the only Member of your Lordships’ House to seek a far greater level of clarity, energy and, crucially, investment in what the White Paper describes as digital literacy. In their White Paper, the Government rightly argue that the promotion of digital literacy has a wide range of benefits,

“including for the functioning of democracy by giving users a better understanding of online content and enabling them to distinguish between facts and opinions online”.

This is an entirely laudable objective, but we have been here before: in fact, 15 years ago, with the media literacy responsibilities of Ofcom as set out in what became the Communications Act 2003, to which the noble Lord, Lord McNally, has already referred today. I am sorry to report that successive Governments, including those of my own party, never seriously grasped, let alone ensured, the delivery of Ofcom’s obligations regarding what the White Paper has now accurately rechristened digital literacy. It is true that for a decade Ofcom made efforts to address this issue, but the specific grant used for that purpose was phased out by DCMS several years ago.

It will be argued that some technical research has been published, but surely that is a woefully inadequate response to the real task at hand: to equip present and future generations with the ability to assess the vast swathes of misinformation, even outright lies, which now proliferate across the internet—whether on social media, blogs or what can at first glance appear to be credible news websites. Had we seriously risen to that task, we might have avoided at least some of the deeply troubling outcomes we now face. We could have made a better job of preparing ourselves for the worst impacts of the environmental crisis that millions of young people now rightly warn us against. Even the result of the referendum might have been different if those aged between 18 and 23 had fully understood the importance of an informed vote for their own futures, and the degree to which they were capable of being marginalised and manipulated in the new digital world.

Any 10 year-old at the time of the Communications Act 2003 will now be aged 26. We are talking about literally millions of voters who, with a better understanding of the power of misinformation, could have demanded a more honest debate on the ramifications and potential outcomes of what for many of them may well have been a life-defining moment. It is estimated that 64% of young people aged 18-24, or 3.6 million out of a total of 5.7 million, turned out to vote in the referendum. Of that 64%, it is further estimated that almost three quarters—2.6 million—voted to remain. Surely it is now imperative that as legislators, we develop a laser-like focus on ensuring that people, especially the young, are equipped with the best means to ensure that never again can our democratic processes be subject to the kinds of distortions we all suffered in the months and weeks leading up to 23 June 2016. Consider this hypothesis, if your Lordships will: had all that age group voted in the same proportion as those who did engage, the result would have been a remain victory by over half a million votes. That is how important a truly informed and fully participatory democracy is to our and their futures.
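For readers who wish to check them, the turnout figures quoted above hang together arithmetically. The following is a minimal sketch of that consistency check; all inputs are the speaker's estimates, not official statistics:

```python
# Consistency check of the estimates quoted above (the speaker's figures,
# not official statistics): 5.7 million people aged 18-24, 64% turnout,
# 2.6 million of whom are estimated to have voted remain.

total_18_to_24 = 5_700_000   # estimated cohort of 18 to 24 year-olds
turnout_rate = 0.64          # estimated turnout among that cohort
remain_votes = 2_600_000     # estimated remain votes from that cohort

voters = total_18_to_24 * turnout_rate
print(f"Estimated voters aged 18-24: {voters / 1e6:.1f} million")

remain_share = remain_votes / voters
print(f"Estimated remain share: {remain_share:.0%}")
```

On these inputs the cohort of voters comes out at roughly 3.6 million, matching the figure quoted, and the remain share at roughly 71%, consistent with the "almost three quarters" cited in the speech.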

The sad truth is that in a digital age, we cannot regulate misinformation out of existence. Those days, if they ever existed, have long since passed. Instead, we need to take unambiguous responsibility for putting tools in the hands of users to enable them to distinguish between fact and fiction. This is far from being a new problem but its scale has increased exponentially and its new forms are extremely challenging for any Government to combat. When he replies, will the Minister give the House some assurance that DCMS, the Home Office and the Department for Education are actively working together and prepared to invest time, effort and energy into correcting a lamentable decade of inaction?

I close by quoting from a speech made to the American Society of Newspaper Editors in 1925 by the then President of the United States, Calvin Coolidge:

“Wherever despotism abounds, the sources of public information are the first to be brought under its control … It has always been realized, sometimes instinctively, oftentimes expressly, that truth and freedom are inseparable …The public press”—

he was speaking at a time when newspapers were pretty well the only form of information—

“under an autocracy is necessarily a true agency of propaganda. Under a free government it must be the very reverse. Propaganda seeks to present a part of the facts, to distort their relations, and to force conclusions which could not be drawn from a complete and candid survey of all the facts … propaganda seeks to close the mind while education seeks to open it. This has become one of the dangers of the present day”.

As the legislation arising from this White Paper moves through the House, I will be arguing that digital misinformation has become the greatest single danger to our democracy and that to pretend otherwise is to risk fatally undermining it.

My Lords, I have listened with great interest to the speeches made so far and also read, in some detail, the online harms White Paper. This followed the Green Paper, published in October 2017, in which there was an aspiration to make the UK,

“the safest place in the world to be online”.

This aspiration, which some might call a faint hope, appears again in the executive summary of the White Paper. I also listened to today’s Statement on yesterday’s social media summit and was interested to hear the Minister say that it was agreed,

“to work with experts … to speed up the identification and removal of suicide and self-harm content, and create greater protections online”.

What does the Minister understand “speed up the identification” to mean? Does it mean immediately, within an hour, a day, a week or what?

I am talking about the earlier Oral Statement on the social media summit.

In the past 18 months, we have seen the internet become less safe and more dangerous for everyone, but in particular for children and young people, in whom I have a particular interest. I am not going to talk about the technical aspects of how we might regulate the internet: I am no expert on bandwidth, et cetera, and the only generation I am interested in is the one currently growing up. I have read about 3G, am using 4G and am reading about the opportunities and threats of 5G.

We must ensure that the next generation of internet companies, and those who profit massively from the industry, exercise a duty of care. Current and future generations of children and young people must be protected so that they can enjoy a fraction of the innocence that we enjoyed. We spent time and money on the thing called the watershed, in an attempt to prevent children watching adult content on terrestrial TV channels. We pay the staff at the British Board of Film Classification to watch every film for which general release is sought, giving each film an age rating. We have established the Video Standards Council to rate video games. Imagine the uproar there would be if the 10 o’clock news had shown the shootings in New Zealand or beheadings by ISIS. However, as the House has heard, when it comes to the internet the only regulation is self-regulation. Even Mr Zuckerberg, one of the worst villains of the internet piece, makes billions while crying crocodile tears about the need for external regulation.

When a gentleman called Mr Ford began to make motor cars, it was soon realised that they could do serious physical damage to people and property. To minimise the damage, a decision was taken to regulate cars; abolishing them was not an option. In England, we have stringent rules on who can drive, the speed at which cars are driven and how drivers must follow the Highway Code. Parents—most of them—teach their children how to cross the road safely. This is reinforced in schools and, as children begin to use roads as cyclists, they are taught how to keep themselves safe. Similarly, car makers are strictly regulated in terms of the safety of passengers and, increasingly, the damage to the environment.

However, the internet, the 21st-century Wild West, seems to have more than its fair share of bandits but no sheriffs to take them on. The internet is, as yet, totally unregulated and is driven by just two motives: making bigger profits and reducing costs. The reason why pornography, to take just one example, is so easily available online is that the internet giants make unbelievably huge amounts of money, directly and indirectly, by hosting pornography sites.

Of course, everyone agrees that young people should not watch extreme violence or pornography and the industry shadow-boxes with parental filters and age limits. However, the research shows that parental filters are easily evaded and age limits are totally ineffective. A decade ago, in a Committee Room in this House, there was a seminar on the dangers posed to children by the internet. There was unanimity, even then, from the Department for Children, Schools and Families, Vodafone and Google that the internet genie was out of the bottle. Since then, successive Governments have talked the talk about protecting children and young people from the hell which is only three clicks away, but no serious attempt has been made to regulate the internet.

I support this White Paper and congratulate the Government on bringing it forward. We should present this not as an attack on freedom of expression but as allowing freedom of expression which does not damage the most vulnerable. I see this as the start of a process. We know that the industry is lobbying hard to protect its profits. We have all heard how it is difficult—which means expensive—to stop offensive and illegal content being readily available.

I pause to reflect on the points made by the noble Lord, Lord Puttnam, about the threat to our society and democracies. We have seen how that has gone on: the presidential election in America was probably affected by bots targeting millions upon millions of people. As political parties, we use social media to campaign, and we do it in a very effective way, but in the wrong hands these means can be turned against democracy. I hope that the Government and the Minister will think hard, in detail, about the points that the noble Lord made.

Internet companies say, “There is nothing we can really do about this”, but just look at what is happening in China. Xi Jinping manages to block anything that does not fit in with his socialist China, often with the agreement of the internet giants themselves, who go along with what he says to ensure their presence in the country. I am not suggesting that we have the same regime as China, but it is possible to put in place algorithms and filters which stop the most harmful effects of the internet. As a Liberal Democrat, I am in favour of individual freedoms, but we also have a duty to ensure that that freedom is constrained by the rights of others.

Children have the right to a childhood, and schools need to educate children to be responsible users of social media. Parents must be empowered to protect their children through digital literacy, advice and support. I hope that the Minister will look carefully at the area of support for schools. The Government will say that schools should be doing more and providing education. The problem is that we have a subject called PSHE—personal, social and health education—which many of us have said should be taught in all schools, but of course academies and free schools can choose not to teach PSHE, or choose not to talk to children about the problems of ensuring internet safety. Unless we regulate the internet to keep our children safe, we will continue to pay a very high price. Parents of children who have committed suicide know how high that price is.

My Lords, I am very glad to be taking part in this debate on a topic that I have raised in this House on numerous occasions. As the number of people who use the internet and the range of things they use it for expand, we all face new challenges in balancing the good with the potentially harmful. I commend the Government and the Ministers involved in this for rising to the challenge.

The well-being of our children and young people online is at the forefront of this document and is something I have worked at and with, in different ways, over a number of years. I very much welcome the Government’s reiteration of their commitment,

“to support parents in preventing and dealing with online harms”.

I am particularly pleased that, since the publication of the White Paper, the Government have announced that age verification for pornographic websites will finally come into effect on 15 July. I and other noble Lords will be monitoring the launch and the effect of this closely. I welcome, too, the intention to bring in a duty of care for social media companies. I shall follow the detail of this debate with interest as well, especially on the role of the proposed new regulator, which will issue codes of practice on preventing children accessing inappropriate content, including codes on:

“Steps companies should take to ensure children are unable to access inappropriate content, including guidance on age verification, content warnings and measures to filter and block inappropriate content”.

I have also been active in supporting family-friendly filtering by mobile phone operators and internet service providers and am concerned to read the report, Collateral Damage in the War Against Online Harms, published last week by Top10VPN and the Open Rights Group, suggesting that these filters are potentially harmful rather than advantageous. The Minister and I have had discussions about this over the years. I have never suggested that filtering is a panacea for parents; it is merely one tool in their toolbox for supporting their children as they grow up in this increasingly digital world. I look forward to hearing the Minister’s response to this report.

The White Paper sets out the Government’s intention for a new online media literacy strategy. I have always argued for educating the public on the options before them to manage their technology use, and especially to help equip parents to raise their children in an increasingly digital world. I welcome the inclusion within the remit of the strategy of:

“Developing media literacy approaches to tackling violence against women and girls online”.

I hope the Minister will be able to expand on the plans in this area and how they tie in with the commitment in the Ending Violence Against Women and Girls 2016-2020 Strategy Refresh, that the Government are,

“working to better understand whether links exist between consumption of online pornography and harmful attitudes towards women”,

and that the Government will,

“commission research in order to better understand the links between consuming pornography and attitudes to women and girls more broadly”.

I would be grateful if the Minister would give us an update on how these projects are progressing and when he expects the research to be completed.

I recognise that this White Paper cannot cover all online harms and does not intend to do so. The Minister made that clear in the Statement on 8 April. However, given the focus of the paper on social media companies and child sexual abuse, I was expecting it to cover two areas which are missing. I was hoping it would have addressed some of the issues that were raised in this House and the other place about the limitations of the extent of the Digital Economy Act 2017. When we debated the regulations that will determine which websites will be required to have age verification, on 11 December last year, there was considerable comment about the current exclusion of social media websites. The Minister said that this issue might be addressed by the White Paper: sadly, it is not. It would be helpful to understand the Government’s decision not to include social media within age verification, when so much of the White Paper is about the responsibilities of social media and children accessing pornography is one of the harms within its scope.

During the debate on the regulations, I also raised my concerns that the final version of the Digital Economy Act left a significant loophole with respect to non-photographic and animated child sex abuse images. This means that the age-verification regulator cannot ask internet service providers to block websites that contain these images. The same point was made in the other place, to which the Minister, Margot James MP, said:

“That strikes me as a grotesque loophole”.—[Official Report, Commons, 17/12/18; col. 612.]

I am very pleased that, since the debates at the end of last year, the Internet Watch Foundation has adopted a new non-photographic images policy and URL block list, so that websites that contain these images can be blocked by IWF members. It allows network blocking of non-photographic images to be applied in filtering solutions, and it can prevent pages containing non-photographic images appearing in online search engine results. In 2017, 3,471 reports of alleged non-photographic images of child sexual abuse were made to the IWF; the figure for 2018, at 7,091, was double that. The new IWF policy was introduced only in February, so it is too early to say whether it will be a success. The IWF is unable to remove content unless that content originates in the UK, which of course is rare. The IWF offers this list on a voluntary basis, not on a statutory basis as would occur under the Digital Economy Act. Can the Minister please keep the House informed about the success of the new policy and, if necessary, address this loophole in the legislative proposals arising from this White Paper?

We are debating a document that is clearly a step in the right direction, and I am sure that all noble Lords certainly congratulate the Government on that. I also very much look forward to hearing the Minister address some of the many points raised by other Members, both before I spoke and following me.

My Lords, it is a pleasure to follow the noble Baroness, Lady Howe, and to precede the noble Baroness, Lady Benjamin, both of whom have consistently campaigned on the dangers of the internet to children. I agree with what the right reverend Prelate the Bishop of St Albans said on gambling; I would support a ban on advertising at football matches.

By way of reminding your Lordships of my interests, particularly as a chief officer at TES, a digital education business, and as chair of xRapid, a health tech business, I will start by reminding the House of the upside of the online world. TES has 11.5 million registered users, and, as a platform for teachers, facilitates the global sharing of teaching resources. This saves teachers buckets of time and helps them access a torrent of quality user-generated content. It is inconceivable without the internet. My other interest trains iPhones to do the work of microscopists in diagnosing malaria, which we are now able to give away to those who need it—laboratory quality at zero marginal cost, thanks to online technology. There are many other examples of technology for good, and if we do not grasp them but instead allow our public services to stagnate, we will be left behind as other nations leapfrog our development.

However, the harm of the internet is also a reality. Many of us are working out how to manage it. I am guilty of routinely overindulging in my screen time; I am digitally obese. At home, our seven year-old, Coco, asked us just this week whether we can agree as a family our own code for gaining consent if we want to post images of each other on social media. That is a job for this weekend. But there are areas where self-regulation will not apply and where we need urgent government and legislative action.

I urge your Lordships to take 15 minutes to watch Carole Cadwalladr’s brave TED talk, delivered earlier this month in Vancouver. As the journalist who uncovered the Cambridge Analytica scandal, she has credibility in her charge that our democracy has been broken by Facebook. Her argument is compelling. Communities such as Ebbw Vale, with very few immigrants, voted overwhelmingly for Brexit because of their fear of immigration. Such communities are not consumers of the mainstream, right-wing media that stir that particular pot, but they are consumers of Facebook. She describes Facebook as a “crime scene”, where the likes of Nigel Farage were able to oversee what she uncovered. Who knows how much money from who knows where was able to fund a firehose of lies through Facebook ads. These were targeted at those who were most vulnerable to believing them, using the illegal hack of personal data from tens of millions of users.

The online harm to individuals, as other noble Lords have talked about, is profound, but there can be no greater harm to a nation state than the catastrophe of Brexit, brought about by a referendum won by illegal campaigning—and we allow Nigel Farage to start another party to dupe the nation once more. We desperately need to update our electoral law to prevent this destruction of our democracy, and I hope that the legislation following this White Paper may present some opportunities for us to do so.

I must also say that I commend this White Paper. I inevitably want it to go further, but the core proposals of a duty of care and of a regulator are sound. As the manager of the TES resources platform, I welcome these regulatory burdens. I am particularly delighted to see the duty of care principle. For some time I have been keen to see this well-established legal principle from the physical world come into the virtual world. I was introduced to the notion by Will Perrin and I pay tribute to him and his collaborators at the Carnegie Trust, and to the Government for listening to them. My assumption has been that, when applied, this will generate civil action in the courts by victims against technology operators for the damage caused by their algorithms and other relevant actions. Can the Minister say whether this will be available under the government plans, or will redress be available only through the regulator?

Speaking of victims of algorithms, I am also interested in whether the measures here will apply to the Government themselves and other public bodies. Can the Minister please help me? I have spoken before about the worrying case of the sentencing algorithm used in Wisconsin courts that defence attorneys were prevented from examining. We have had another example closer to home. Last year it came to light that our Home Office had deported potentially thousands of students on the strength of a contractor’s machine analysis of voice recordings. The Home Office asked the Educational Testing Service to analyse voice files to work out if students were using proxies to sit the English tests for them, and an immigration appeal tribunal in 2016 heard that, when ETS’s voice analysis was checked with a human follow-up, the computer had been correct in only 80% of cases, meaning that some 7,000 students had their visas revoked in error.

Given what we know about algorithmic bias, and the growing use of algorithms for public service delivery, it is critical that public bodies are also subject to the measures set out in the White Paper. I would also say that, since the Government are increasingly building technology platforms to compete with the private sector, it would be unfair not to impose the same regulatory burdens upon them as there are on those of us working in the commercial world.

My final point relates to the valid point made in the document that technology can be part of the solution. I agree. But there is a danger that the demands placed on technology companies will assume that they are all of the size and wealth of Facebook, Amazon, Google and Apple. That would be a mistake: those giants can afford to develop solutions and would gain a competitive advantage over smaller businesses as a result. We need to ensure that these measures result in a more, not less, competitive landscape. If there are technology solutions to difficult problems such as the copyright infringements that I grapple with or other thefts of intellectual property, those tools should be openly available to platform providers of all sizes.

When Sir Tim Berners-Lee invented the web he had a great vision that it should be for everyone. Earlier this month he said that the internet,

“seemed like a good idea at the time”,

that the world was certainly better for it, but that,

“in the last few years, a different mindset has emerged”.

At the 30-year point, people have become worried about their personal data, but they,

“didn’t think about it very much until Cambridge Analytica”.

The privacy risk, however, “is subtle”, he argued:

“It’s realising that all this user generated data is being used to build profiles of me and everyone like me—for targeted ads and more importantly, voting manipulation. It’s not about the privacy of photographs, but where my data is abused”.

We need new duties on technology companies and we need a regulator with teeth. I wish the Government well and I hope that we will see legislation on this very soon.

My Lords, it is an honour to follow the noble Lord, Lord Knight, and I too congratulate the Government on bringing forward this important online harms White Paper.

I have been speaking out about finding ways to protect the vulnerable and impressionable online for almost two decades now. When I was on the Ofcom Content Board 16 years ago, I continually raised my concerns and pleaded for online regulation. But at the time such ideas were considered by many to be an assault on freedom of expression, and it was thought that the internet was an open space where regulation had no place. How things have changed. Today, through this White Paper, we are now about to change the world and bring morality, integrity and trust to the forefront of the online world for the betterment of humanity.

There is no doubt that the internet is a place not only where the best of human spirit blossoms but where the worst and most sordid elements of the human condition can be expressed, shared and amplified on a global scale. Without doubt, the internet and the digital revolution are changing the world. The Pandora’s box of limitless access to information has been opened and the progress of technology seems unstoppable.

But not all progress takes us forward. Emerging evidence shows that children are being exposed to a vast range of online harms: pornography, inappropriate content, online gambling, body shaming, suicide, bullying, eating disorders, online grooming. The list goes on and on. It is not just children who are targeted, but adults, too—especially vulnerable groups and those in public life. They are having to deal with fake news and extreme political, racial and religious ideology, as well as hate crime, fraud and blackmail. The impacts can be life-changing: they can have serious psychological and emotional effects on those who have to endure relentless abuse, which is taking its toll on society’s well-being.

I have dedicated my life to the well-being of children, and it is children who are predominantly at risk from online harms. It is accepted that the internet offers children a range of wonderful opportunities to have fun, create, learn, explore and socialise. But tech firms are failing our children, and it seems that they will not take action until they are forced to. They must establish a duty of care for their customers, who want to be empowered to keep themselves and their children as safe online as they are offline. Currently, insufficient support is in place, so many feel vulnerable online.

Last week, I hosted the launch of the Internet Watch Foundation’s annual report. I declare an interest as one of its champions. For the past 23 years, the IWF has taken on what you might call “the toughest job in the world”: removing thousands of child sexual abuse images from the web. Worryingly, it has told me that the extreme content is getting worse and worse. I wept when I heard harrowing stories of how children, including newborn babies, are being sexually abused and then re-victimised by having their image shared across the world online, again and again.

The IWF welcomes the online harms White Paper and its focus on making the UK the safest place in the world to be online. The paper fits with the IWF’s charitable objectives and its vision of an internet free from child sexual abuse. The IWF calls on the Government to recognise the efficiency and success of its work and to ensure the security of its partnership approach in the new regulatory framework proposed in the White Paper. But it is concerned that its partnership model could be swept away accidentally and its ability to remove images of child sexual abuse hindered.

When the IWF was founded in 1996, the internet was a vastly different environment. The tech giants of today, including Google, Facebook and Amazon, did not exist. More and more people are now using the internet. The Government, courts and legislative processes are no longer able to keep pace with that change or predict where the future will take us. Therefore, the IWF is calling on the Government to ensure that any legislative proposals and definitions are nimble enough to be adaptable in future, with as wide an application as possible to keep up with the rate of change and innovation in the tech sector. We know that the size, nature and processes of companies within the internet industry are wide and varied, so the Government must work in tandem with the industry to develop a code of practice to effectively address regulation of the internet in a realistic and enforceable manner and recognise that one size does not fit all.

I sit on the House of Lords Communications Committee and, in our latest report, The Internet: to Regulate or Not to Regulate?, we recommended that a new body, which we call the digital authority, should be established to co-ordinate regulators in the digital world, and that this body should continually assess regulation in the digital world and make recommendations on where additional powers are necessary. It should also establish an internal centre of expertise on digital trends, which would help to scan the horizon for emerging risks and gaps in regulation and help regulators to implement the law effectively and in the public interest. We foresee the digital authority co-ordinating regulators across different sectors and multiple government departments, so we recommend that it should report to the Cabinet Office and be overseen at the highest level.

I always say that childhood lasts a lifetime. As my noble friend Lord Storey said, schools need to educate children about how to use social media responsibly and be safe online, as supported by the PSHE Association. Parents must be empowered to protect their children through digital literacy advice and support because, for most children today, their childhood is being brutally snatched away from them.

This is a pivotal moment, and the rest of the world is watching to see what measures are put in place to regulate the internet. But we must be wise and learn from experience. For example, if the Digital Economy Act were in front of us today, there is no doubt that social media would not be excluded. It is a glaring omission. Will the Minister confirm that the Government will address the exclusion of social media from age verification for commercial pornography at the earliest opportunity? The BBFC should have the power to ensure that an AV wall is in front of all commercial pornography. It makes no sense not to include social media. I believe that the speed of change in this space is such that we will move to a situation where AV is routinely used for a range of content for different ages, and that this is a good thing.

I welcome the recognition in the White Paper of the BBFC’s age ratings online. However, it is vital that age ratings can be linked to parental controls and filters. I also heard from the BBFC that its classification tool for crowd-rating user-generated content, You Rate It, would be perfect for YouTube where, according to Ofcom’s research, many children now view content. It would mean that parents and children could report abuse. Will the Government be prepared to endorse and encourage crowd rating?

Age verification, which I and many other noble Lords across the House have fought for over many years, will finally become operational in July. The legislation and technical innovations to carry out rigorous and secure age verification will soon, I hope, be taken up by other countries across the world. I believe that this will be the same for the measures proposed in the online harms White Paper.

It is wonderful that the DCMS and the Home Office are working together on this important issue, as we need a holistic approach in which other departments, such as Health, Education and the Treasury, are involved. We all have our part to play if we are to counteract the onslaught of online harm.

I urge the Government to concentrate on bringing in a new regulatory framework that can genuinely make the internet a safer place as soon as possible, because every day we hear more horrific stories of online harm. I look forward to working with the Government to progress this important White Paper. Once again, I congratulate them on producing it, because it shows that we intend to be the leading force in the world when it comes to online protection and safety. My Lords, there ain’t no stopping us now!

My Lords, most of the focus in the media and in Parliament about online harms has rightly been on children and young people. However, I suggest that government, social media companies and the new regulator also think about people with learning disabilities and other vulnerable adults. I remind noble Lords that Article 9 of the UNCRPD requires states to enable disabled people to participate fully and to have access on an equal basis; for example, to information and communication technologies.

According to research published by Ofcom this year, about 70% of the 1.5 million people with learning disabilities in the United Kingdom have a smartphone and a laptop or computer. While this is significantly lower than the proportion of the general population, it still indicates that a majority are active online. For those who go online, there will be clear benefits.

Having a learning disability often means that people have fewer friends and fewer opportunities to socialise than the general population. Social media could be an effective way to connect with others and to build friendships and relationships with like-minded people. However, many have not enjoyed these good outcomes but instead have had distressing experiences.

Many have been financially exploited by people who prey on the fact they have an intellectual disability and are less able to spot a scam. Scammers might pose as a business offering a product or service, as a health professional or as an individual offering friendship or a romantic relationship. This type of “befriending” is often referred to as “mate crime” in the disability sector—there is a tragic history of this occurring both online and offline—and is intended to exploit them financially, physically and sexually. This type of online grooming might begin with the inappropriate sharing of images. Some people with a learning disability, particularly those with limited support, find it difficult to recognise that it is inappropriate and dangerous; nor do they know where to seek help.

Such negative experiences may lead people simply to retreat from social media platforms. As part of the online harms White Paper consultation, I suggest that the Government need to engage directly with people with learning disabilities, as well as with the organisations that represent them, such as the Royal Mencap Society, Dimensions and the Foundation for People with Learning Disabilities.

The 2018 digital charter had several principles, one of which was that people should understand the rules that apply to them when they are online. This raises questions about whether some people with learning disabilities have the mental capacity to use the internet safely, and about what measures social media providers may need to take to make the internet safe and inclusive.

In a judgment delivered in February in the case of Re A (Capacity: Social Media and Internet Use: Best Interests), Mr Justice Cobb commented:

“Online abuse of disabled people has become, and is, an issue of considerable and increasing national and international concern”.

He concluded that A, a man with learning disability, must be able to understand that information and images he shared on the internet might be shared more widely, including with people he did not know; that privacy settings might enable him to limit what is shared; that other people might be upset or offended by offensive material that he had shared; that some people he met online might not be who they said they were; and that someone who called themselves a “friend” on social media might not be friendly. He also suggested that some people whom he did not otherwise know might pose a risk to him.

It was a very thoughtful judgment, which concluded that A did not have capacity to use the internet safely. Mr Justice Cobb also made the point:

“The use of the internet and the use of social media are inextricably linked; the internet is the communication platform on which social media operates … It would, in my judgment, be impractical and unnecessary to assess capacity separately in relation to using the internet for social communications as to using it for entertainment, education, relaxation, and/or for gathering information”.

I would add that access to the internet is also needed for, for example, telecare, which is important for many disabled people.

I, too, welcome the proposed “duty of care” for social media companies. This must include reference to vulnerable adults, including those with learning disabilities. Social media companies have powerful algorithms working to clamp down on copyright and other infringements and it makes sense that these should also protect people from abuse, scams and grooming.

The consultation states:

“The regulator will also have broader responsibilities to promote education and awareness-raising about online safety”.

I strongly suggest that central to this is ensuring that people with learning disabilities are also provided with the skills, tools and knowledge to keep themselves safe online as well as to know where to go to report incidents and get the right guidance and support. Only with this education will people be able to understand online safety and be included in this new technology.

I look to the Minister for an assurance that, in creating this new framework, the Government will include the needs of vulnerable adults in its scope, so that social media companies and others will work together to protect people with learning disabilities from abuse, scams and grooming.

As my noble friend Lord Knight mentioned, earlier this year we celebrated 30 years of the internet, and the BBC made a series of programmes about its development. One was about how a medium that spreads knowledge and information has been used to undermine many of the values of our society, so doing the harm that we are debating. This meant that the internet platforms had to think seriously about monitoring content. They could not rely on people reporting harmful content, because many had sought out the material on purpose.

We were then shown how the monitoring takes place. There are algorithms looking for harmful phrases, words or images, but apparently these can be easily fooled. Therefore, a major part of the work is human monitoring, and we saw how one of the major platforms does this. It employs hundreds of people in Malaysia and the Philippines to scan posts for indecency, child abuse and threats: things that are already illegal on the internet. It was interesting to see the monitors at work. Decisions are instantaneous. Where perhaps the Minister or I would want to give a matter more consideration, there is no time, because monitors have to fulfil their quotas or their pay is docked.

This is the practicality of monitoring the internet. When the duty of care required by the White Paper becomes law, companies and regulators will have to do a lot more of it. The paper suggests that the regulator will be funded by a levy on the companies—they will need it. The Minister assured us that regulations will be reasonable and proportionate. Yes, there will be a code of conduct. I think that the internet companies will welcome this, because it firmly puts the responsibility on government to decide what is and is not acceptable, and where lines should be drawn. The lines may be drawn in different places in different countries, but I agree that we have to make a start.

I agree with the noble Lord, Lord Anderson, and hope that the Government will make an important part of this code of conduct requiring internet platforms to provide information voluntarily to help the authorities find the authors of harmful material. Their identity is often covered by many layers of encryption or by using off-grid servers. Indeed, I presume these regulations will apply only to the open internet. Do the Government also hope to regulate the dark web and private servers?

Of course, there are other ways to achieve the same objective. Like the noble Lord, Lord Kirkhope, I ask the Minister whether we will keep the GDPR rules of the EU. These seem to rely on swingeing fines acting as a deterrent; the size and resources of the European Union are presumably needed to collect such fines from companies 5,000 miles away, but I am sure that they do act as a deterrent.

The alternative is to deal with this through some good, old-fashioned anti-monopoly legislation, making sure that customers and users are not being exploited. As we are all locked into using these platforms, are we being exploited by lack of choice, lack of transparency or lack of content? I put it to the Minister that there is a case for this. He will be aware of the growing unease about the concentration of power and control over the internet in a few companies; we all know who they are. Much of this power and control lies in the fact that the same company that provides the platform also provides the content and the goods. Doing both enables a company to dominate trading online, causing harm to many small and medium-sized companies that trade on or off the internet. This is harmful to society too, as explained by my noble friend Lord Griffiths and the noble Lord, Lord McNally, because it causes economic harm.

So, there is a case for commercial harm. Internet platforms recognise this and have recently produced data ethics guides or appointed prestigious advisory councils to look at not only harm but the impact of artificial intelligence. Part of their task also seems to be helping to argue that breaking up the dominant companies will be bad for innovation and progress. I suspect that reducing the harm of these companies by treating them as monopolies is some way off but, in the end, it may become the only effective way of dealing with the harm that concerns us, making the internet a safer place without having to create trusted institutions to handle the data. The promised media literacy strategy can play an important role. Like my noble friend Lord Puttnam, I think it should be high on the agenda to assist us in helping ourselves and our families.

The Government are right to act. As I said, it will require a lot of people and money, but let us not forget old-fashioned monopoly legislation because it may come down to that. The Minister spoke about international co-operation. What steps will the Government take to get others to work with us? I agree that we cannot isolate ourselves from the rest of the online world.

I declare an interest as a series producer at Raw TV, making content for CNN. I support many of the suggestions in the White Paper, particularly the need to place a duty of care on tech companies to prevent the harms that appear on their platforms. The Communications Committee’s recent report on regulating the internet stated:

“Given the urgency of the need to address online harms, we believe that in the first instance the remit of Ofcom should be expanded to include responsibility for enforcing the duty of care. Ofcom has experience of surveying digital literacy and … experience in assessing inappropriate content”,

and balancing it against free speech. The new regulator recommended in the White Paper is an exciting idea that I fully support, but it will take some time to create and action needs to be taken now.

I was reassured by the Minister’s assurances on free speech at the beginning of the debate, but I would still like to draw his attention to the wide range of organisations covered by the regulator under the White Paper. Paragraph 4.2 looks at types of online activity, including “hosting”, “sharing” and “discovery of user-generated content”. My concern is that this definition is so widely drawn that it will cover much user-generated content on the websites of broadcasters and newspapers. As the Minister pointed out, these are already regulated by Ofcom, IMPRESS or IPSO. Some of the UGC on these publishers’ websites is also regulated, particularly that which has gone through a process of editorial control. However, a lot of the other comments and UGC on these websites are not covered and are not being looked at under the regime suggested by the White Paper. I suggest that they should be dealt with by extending the remit of the existing regulators, rather than being duplicated by a new regulator.

I am also concerned by some of the definitions of online harms set out in Table 1 in paragraph 2.2—the noble Lord, Lord Griffiths, talked about them—particularly those under the column entitled “harms with a less clear definition”. I am worried that unless their definition is carefully focused, they will have a chilling effect on free speech by leaving media companies vulnerable to allegations of breaching their duty of care. One such harm is “disinformation”, which we are all against when it covers the dissemination of lies. I fear that, despite the Government’s laudable intention, a wide definition would allow interest groups and individuals being investigated by reporters to disrupt research and undermine the credibility of news organisations with allegations of fake news. For example, we have seen super-complaints against media outlets reporting on the pharmaceutical industry and exposing the side-effects or addictive qualities of certain drugs. The threat of a digital regulator questioning the original journalism, and the comments from users who report the side-effects of these drugs, could stop these investigations taking place.

Another term that worries me is “violent content”, which also comes under the column entitled “less clear definition”. This definition must also be drawn very carefully so that it does not censor reports on demonstrations or terrorism. Even if such reports have been carefully edited, there could still be complaints of incitement to or encouragement of violence. For instance, reporting from the Catalan independence referendum showed many shots of the police violently tackling voters to prevent the banned vote going ahead. In this case, a wide definition of “violent content” could be interpreted to cover these images of extreme police action on the grounds that they incite violence; they might therefore be taken down. I ask the Minister to draw these definitions carefully so as not to chill free speech. It would be ironic if the legislation coming from this White Paper managed to quash valid and important free speech when it should be stopping a much greater harm.

A completely different area of the White Paper, mentioned by the right reverend Prelate the Bishop of St Albans, worries me: the Government’s approach to internet addiction. Paragraph 1.19 of the White Paper states:

“The UK Chief Medical Officers (UK CMOs) commissioned … a systematic evidence review on the impact of social media use on children and young people’s mental health. The review covered … online gaming … and problematic internet use, which is also known as ‘internet addiction’”.

However, paragraph 1.20 states:

“Overall the research did not present evidence of a causal relationship between screen-based activities and mental health problems, but it did find some associations between screen-based activities and … increased risk of anxiety or depression”.

The White Paper concludes that the evidence does not support the need for,

“detailed guidelines for parents or requirements on companies”.

Box 15 suggests that,

“the regulator will continue to support research in this area … and, if necessary, set clear expectations for companies to prevent harm to their users”.

I suggest that the White Paper is kicking the can down the road. Millions of parents in this country will have stories of trying to limit their children’s screen time and the dreadful battles that ensue. Your Lordships have only to read Shoshana Zuboff’s widely praised book, The Age of Surveillance Capitalism, to understand that addictiveness is built into many platforms, especially social media sites such as Facebook.

Chapter 8 does look at regulators working with tech companies to enforce safe design in the digital world, but I suggest that the Government should specifically ask the regulators to look at internet addiction more thoroughly and, if necessary, force tech companies to change their algorithms and coding so that this addictiveness is reduced. Obviously, tech companies want to encourage users to spend as much time on their platforms as possible, so it is only through direct intervention by the regulator that anything will be done to combat internet addiction.

There is one area of internet addiction that I am particularly concerned about: internet gaming disorder, a condition which at the moment affects many young people, especially young men. There is great concern among addiction specialists about this problem. It has been difficult to diagnose the condition comprehensively because so many different measures have been used, but next month the World Health Assembly will discuss whether gaming disorder should be included in the International Classification of Diseases. Doctors and psychologists are convinced that, once that happens, the terrible extent of this problem will become only too clear.

The Minister has only to talk to players of the game “Fortnite” to understand how very clever the company designers have been in making it addictive. Even when the player stops playing, the game carries on, and when they rejoin there are endless incentives to keep playing. There have been many cases of young people being severely sleep deprived, refusing to leave their rooms and, in some cases, even becoming suicidal. Policymakers in China and South Korea, where this has been a particular problem, recognise that internet gaming disorder needs to be dealt with. They have started to combat it by working with parents and by engaging with the gaming companies to build in design that limits the amount of time played and, in some cases, cuts off play after a certain period. The White Paper needs to bring together stakeholders and the gaming industry to draw up new regulations right now to mitigate the problem of gaming addiction, along the lines of what is going on in the Far East. I ask the Minister to ensure that any further legislation takes this problem into account. Millions of parents across the country will be grateful and, in the long term, so will their children.

My Lords, I rarely speak on DCMS issues. I talk about them when they have been taken away from that department and pushed down the line to the Home Office or the health department. I am thinking here of how the licensing legislation for drinking started in DCMS. It ended up with the Home Office and effectively the major interest in the issue now is with health. It is the same with gambling, where again we started with DCMS. I forecast that this major document and the legislation that is to follow it will not stay primarily with DCMS but will go to the Home Office, where the security issues have to be dealt with, but much of it will end up with the health department. Here I am pleased to be following the noble Viscount, Lord Colville, because the areas he has touched on are those in which I have a particular interest. I have gone from drink, drugs, sugar, diabetes and obesity to what people are doing about obesity in children and the failure of parents to watch what their children are doing, including the time that they are spending on the internet, particularly on gaming, and the effect that this is having on family life and so on.

I picked up on the issue of opioids on page 16, while on page 20 mention is made of the report of the UK Chief Medical Officers which looks at the problems that will arise in the future. At the moment the prospective Bill will not look at them. A real challenge is the emergence of the problem of excessive screen time. Children are spending an average of 15 hours a week playing games. Pages 26 and 27 deal with addiction, which was picked up on by the noble Viscount. I am worried that we do not have enough about health in this document. If it is not in there now, it will most definitely come along further down the line.

I welcome the White Paper overall because it pulls together many areas where we have had concerns for quite some time. I welcome the statutory duty of care, but it is a pity that that has been taken from the health and safety regulation. Health has thus been denied its inclusion, and I suggest to the Minister that when we come to rewrite the title we do not just talk about online harms but add a colon and the word “safety” and a tag behind it saying “health”. I think that health will have to be looked at in that context. A new title would be better because we should try to make it look a bit more positive than it does at the moment.

We also have to look for ways in which we can engage better not with the smaller companies in the industry but with the big ones. We should differentiate between the big international monopolistic players and the smaller companies that are trying to make their way and grow. It should not quite be a blunderbuss right across the board; we should split these companies into two categories. We are dealing here with people who have big money and great power.

Referring to the appearance of the noble Baroness, Lady Blackwood, before us earlier, we can see that the Department of Health and Social Care has seized the initiative on organising meetings. The Government will need to have someone who is clearly in charge. I am not sure from which department they will come, but someone has got to be ultimately responsible for dealing with the major companies on an international basis as well as domestically within the UK.

It is good to see that we have the statutory duty of care and that we will use technology as part of the solution. It is good too that the funding for this will, it is hoped, come from the industry. I suggest that we need much more funding than is required simply to cover the regulator. We ought to have a look at what happened when the National Lottery was introduced. By and large it has not been criticised as much as gambling in general over recent years. The National Lottery is acceptable because some of the money raised by the lottery goes back into society. We need to engage with the major players not just about harms but about how we can move them towards taking a positive approach. Perhaps we should be looking at them not just to pay for regulation but to create something like the kinds of additional funds that from the 1990s onwards have been generated and then spent on, for example, heritage projects. That money from the major players should be used to pay for research and to encourage them to explore those areas where technology can be used positively as well as negatively.

My interest in this issue is based on friends with children who are totally addicted to gaming. The problem is quite widespread and is growing, and it needs to be seriously addressed. If you can get children addicted to gaming, why are we not looking at whether gaming can be used as a means to attract the attention of children who perhaps have mental health problems or physical problems with obesity and so on? We should try to develop positive games that encourage them to care for themselves rather than simply persuading them to buy the next game, which is what so many of the gaming exercises are about at the moment: making money. There is an opportunity to engage with the major players and try to move them in that direction. They could still make money but they would be doing so on something that is worth while for the populace generally, and in particular for the welfare of our children in the future.

I am working with a group of people in a television company to try to do this. We have identified many of the games currently being played which are good, but they are in the minority. I have tabled a Question for a couple of weeks’ time asking the Government what they are doing in terms of research into gaming on the positive side rather than the negative one. I hope that the Minister, who is going to be answering that Question, is prepared to come up with some positive responses. Money needs to go into this and if the Government will not fund it, we certainly ought to be going to the private sector, perhaps in conjunction or in partnership with the Government. We could then develop a positive approach to the good elements of technology rather than spending all our time talking about the negative ones.

I am unhappy about the absence of the health elements, which I believe will come in due course, without question, as night follows day. We should be preparing for that, and perhaps the Minister might reflect on whether a little more should be included on the problems coming on the health side. They would then already be there and, even if not addressed in the Bill that comes, would at least have been laid down to be reviewed and worked on in the future.

I hope the Minister might look also at the possibility that more money is taken on a persuasive basis from the big players, so that we do not just cover the cost of the regulator but start to invest in research in those areas where games, for example, could be used positively for the benefit of children, rather than negatively.

My Lords, it is excellent to follow the noble Lord, Lord Brooke, because I have worked with him on areas of addiction. I know of his campaigning in this area, and I admire and follow with interest his constant insistence on connecting it to health. I also thank the Minister for providing us with this debate. As the noble Lord, Lord Griffiths, rightly described it, it has been a good opportunity to have a fascinating conversation.

Every noble Lord has said that the White Paper is very welcome. To date, the internet, and social media in particular, have opened up huge freedoms for individuals. But this has come with too high a societal price tag, particularly for children and the vulnerable, as described by the noble Baroness, Lady Hollins. There is too much illegal content and activity on social media, including abuse, hate crimes and fraud, which has mostly gone unpoliced. As my noble friends Lord McNally, Lady Benjamin and Lord Storey said, we on these Benches therefore support the placing of a statutory duty of care on social media companies, with independent regulation to enforce its delivery. My noble friend Lady Benjamin was quite right to say that she was seated at this table a long time before many of us. The independent regulator could be something like the Office for Internet Safety, or, as described by the Communications Committee, the digital authority.

The evidence has been clear from the DCMS Select Committee, the Lords Select Committee on Communications, Doteveryone, 5Rights and the Carnegie Trust: they have all identified the regulatory gap that currently exists. A statutory duty of care would protect the safety of the user and, at the same time, respect the right to free speech, allowing for a flexible but secure environment for users. We agree that the new arrangements should apply to any sites that, in the words of the White Paper,

“allow users to share or discover user-generated content or interact with each other online”.

The difference between regulated or self-regulated providers of information and platforms of unfiltered content is not something that your average teenage user of “Insta”, as they call Instagram, can discern—by the way, it is never Twitter they use; that is for “old people”. These Insta-teens do not distinguish between a regulated, substantiated information provider and inaccurate and harmful content or links. The noble Lord, Lord Puttnam, talked about digital literacy, which is absolutely essential. One of the greatest gifts we can give a new generation of children is the ability to question the content that is coming to them. Proper enforcement of existing laws, as mentioned by the noble Lord, Lord Anderson, is vital to protect users from harm. But the useful addition is that social media companies should have a statutory duty.

My noble friend Lord Clement-Jones so ably chaired the Select Committee report on artificial intelligence, Ready, Willing and Able?; a report that some of us talked about only last week. A year later, it is still keeping us all very busy with speaking engagements, and therefore my noble friend is very sorry that he cannot be here. He is currently in Dubai at the AI Everything conference to talk about AI and ethics. When the White Paper was published, he rightly said:

“It is good that the Government recognise the dangers that exist online and the inadequacy of current protections. However, regulation and enforcement must be based on clear evidence of well-defined harm, and must respect the rights to privacy and free expression of those who use social media legally and responsibly”.—[Official Report, 8/4/19; col. 431.]

He welcomed the Government’s stated commitment to these two aspects. The essential balance required was described by my noble friend Lord McNally, the noble Lord, Lord Kirkhope, and the noble Viscount, Lord Colville.

Parliament and Government have an essential role to play in defining that duty clearly. We cannot any longer leave it to the big tech firms such as Facebook and Twitter, as we heard from the noble Lord, Lord Haskel. We have been waiting and waiting for it to be done on a voluntary basis, and it is simply not good enough. It was therefore good to hear the Statement earlier today, on yesterday’s emergency summit, about self-harm and the commitment from some of the big tech firms to provide the Samaritans with support. However, the right reverend Prelate, my noble friend Lord Storey and the noble Baroness, Lady Thornton, were right about the need to follow the money and look at the level of investment versus the level of profit. I hope that the Minister will respond on that.

I want to explore in particular the use of regulators that currently exist. Our findings on these Benches, following a series of meetings with the main regulators and after hearing evidence, are that they are keen to get started on some of these areas. While I appreciate that we are still in a period of consultation, I would like to explore this issue, because the need to deliver soon for the whole current generation is significant.

Does the Minister agree that it may well be possible to extend regulatory powers to Ofcom to oversee the newly created duty of care? Does he agree that Ofcom could, in principle, be given the powers and appropriate resources to become the regulator that oversees a code for harmful social media content, and the platforms which curate that content, to prevent online harms under the duty? As the noble Viscount, Lord Colville, asked, what are the possibilities for the use of current regulators? Ofcom’s paper on this very issue, published last September, was very helpful in this respect. We heard from my noble friend Lord McNally about the success of the Communications Act 2003, and the scepticism beforehand about its ability to deliver. That debate runs in complete parallel to the current one about how regulation can apply to the internet—so it has been done before.

Likewise, how does the Minister view new powers for the Information Commissioner and the Electoral Commission, particularly in respect of the use of algorithms, explainability, transparency and micro-targeting? I apologise that I cannot provide more detail—I cannot seem to get on the internet here today, which is ironic—but there was a fascinating recent TED talk about voter suppression: not just influencing how people vote but trying to suppress voter turnout altogether, which I find horrific. What are the possibilities for the ICO and the Electoral Commission to expand and take up some of these duties?

The White Paper refers to the need for global co-operation, and my noble friend Lord Clement-Jones is pursuing the possibility of the G20 in Osaka being used as a key platform for an ethical approach to AI. Is it possible that the White Paper agenda could be included in this? In particular, it is about using the recommendations on ethical principles from the AI Select Committee, and the five principles for socially good AI from the European Commission High-Level Expert Group. What are the possibilities around that, given that we are trying to push for global agreement on responsible use of the internet?

The noble Lord, Lord Knight, mentioned transparency. There must be transparency around the reasons for decisions and any enforcement action, whether by social media companies or regulators. Users must have the ability to challenge a platform’s decision to ban them or remove their content. I very much welcome the fact that technology has been highlighted as something that is part of the solution.

For children, “not online” is simply not an option. Children at secondary school now have to use it to do their homework. It is no good me saying to my 13 year-old, “Get off your screen”, because he just might be on “Bitesize” or doing his maths. I have to get my kids’ meals paid for on this, so online is very much part of a child’s life. Screen-based activity could mean that they are doing their homework—fingers crossed.

However, I completely agree with what was said about the resistance of the gaming sector, in particular, to engage with this issue, and I support the noble Viscount, Lord Colville, on this. But my noble friend Lord McNally rightly pointed out our limitations generationally. Fair warning: I think that sitting through a popular vlogger on YouTube with 2 million subscribers describing the “Endgame” version of Fortnite to us would not enlighten us as legislators. It is therefore about us getting it right.

The noble Lord, Lord Anderson, said that we have been lucky. What I fear and worry about most of all is that today’s generation is not going to get the value of this White Paper, and that is particularly unlucky. Therefore, to get the balance right, as my noble friend Lord McNally rightly said, we should be considered in our approach. But I worry about the need to deliver quickly, and that is why I am asking whether there are regulators who can possibly trial some of this legislation in advance, rather than it going through pre-legislative scrutiny—because we know that some of them are ready to deliver on some of this.

I assume that when seat belts were originally discussed in a place like this, there were still some people who wanted children to bob about on the back seat in the name of freedom, without laws to protect them. Today that would be horrific and unheard of. We will probably look back and feel the same about children online. My noble friend Lord Storey is absolutely right: a child can find their way around the age issue. Give me a handful of 11 and 12 year-olds and I will show you the social media apps they are currently on. All of them are restricted to 13—but they are all on them. That is why the age verification issue must be tied in with younger people in particular, as was mentioned by the noble Baroness, Lady Howe, and my noble friend Lady Benjamin. In order to double or triple check, I ask the Minister whether it is being delivered in July. I thank him for his thumbs up; it is very well taken.

As I have said, we have a moral duty to move at quite a pace now. The White Paper is extremely welcome and I look forward to supporting its rapid progress through this House.

My Lords, I join others in thanking the Government for ensuring that the House has had an early opportunity to debate this White Paper. It has been long-trailed—it kept approaching and disappearing in our thoughts as the Minister came under pressure to define his timescale—but it is here, it is good and we will support it. However, it has also brought up a number of issues that have been raised today and we need to address them.

The number of speakers in the debate may be relatively low but the quality of the content has been extremely high. I have been scribbling notes all the way through, often overwriting what I was going to say as additional points came through. I will probably not be as clear as I would wish to be but that is a reflection of the quality on display today.

We also had the chance to see practical examples of the issues in play in the exchanges on the Statement that preceded this debate. Some concrete examples were quite worrisome and I hope they will be looked at carefully by DCMS, even though the Statement was from the Department for Health and Social Care.

It would be invidious to pick out particular contributions to the debate—as I have said, the standard has been high—but it would be remiss of me not to pay tribute to the noble Baroness, Lady Howe, for her contribution. She has been a doughty campaigner on these issues for as long as anyone can remember—she can remember a long way back; I mean no disrespect by saying that—and it must be a sweet moment for the Minister that, despite the criticisms she still has, she welcomed what has been put in front of us today.

We are not discussing a Bill and I take the Minister’s point that this is a White Paper for discussion. It has some green pages to which we are encouraged to respond, and I hope we will all respond where we can. I also hope that the Minister will take on board what has been said today because it has been a useful contribution. Many people have spoken about the wording of the paper itself, which gives a sense of where we are in this debate. I shall do so as well. I have some general points that I wish to make at the end of what I have to say, but I shall start with one or two points of detail because it is important that we pick up on issues of substance.

On the statement in the White Paper on a new regulatory framework for online safety, in paragraph 21 there is an assertion that the Bill will contain powers for the Government to direct the regulator, when appointed, in relation to codes of practice on terrorist activity or child sexual exploitation and abuse—CSEA—and that these codes must be signed off by the Home Secretary. This is an issue in which Parliament needs to be involved and I hope the Minister will reflect on that and find a way in which we can get further engagement. I do not think it appropriate for the Executive simply to commission codes, have the Home Secretary sign them off and implement them without Parliament having a much greater role.

Paragraph 22 refers to the need to make sure that the codes of practice relate to currently illegal harms, of which there are many, including violence and the sale of illegal services and goods such as weapons. The clear expectation is that the regulator will work with law enforcement to ensure that the codes keep pace with the threat. This also is a wider issue because obscenity law is also in need of updating. We have had discussions on previous Bills about how there is discontinuity in how the Government are going about this. I hope that point will also be picked up.

A number of noble Lords raised the importance of transparency for any work that might be done in this area. The most disappointing aspect is the rather coy phrasing in the White Paper in relation to algorithms. Paragraph 23 refers only to the regulator having powers to require additional information about the impact of algorithms in selecting content for users. The bulk of the argument that has been made today is that we need to know what these algorithms are doing and what they will make happen in relation to people’s data and how the information provided will be used. This issue came up in the discussion on the Statement, when it was quite clear that those who might be at risk from harms created on social media are also receiving increasingly complex and abusive approaches because of algorithms rather than general activity. This issue is important and we will need to come back to it.

Moving on to the companies in the scope of the regulatory framework, the phrasing of paragraph 29 is interesting. It states:

“The regulatory framework should apply to companies that allow users to share or discover user-generated content or interact with each other online”.

That does not cover the point that, as many others have said, a much wider set of societal and economic indicators will be affected by the work on social media. We cannot allow the opportunity to legislate here to be missed because of some constraint on looking only at user-to-user interactions. We need to consider the impact on the economy more broadly.

When the Minister responds, or perhaps in writing later, will he consider the question raised in paragraph 33, which states:

“Reflecting the importance of privacy, any requirements to scan or monitor content for tightly defined categories of illegal content will not apply to private channels”?

We need to know more about what is meant by “private channels”. There is more in the White Paper but this exclusion of private communications may be too great a risk to bear. If we are talking about WhatsApp or Facebook Messenger messages being private, we will also miss out on the problems that have been caused by harassment, bullying, aggression and other issues raised in earlier debates.

On the independent regulator, which I shall come back to later, there is a very narrow issue about the wording of paragraph 35, which says that,

“the regulator will work closely with UK Research and Innovation (UKRI)”.

Why has that body been picked? There must be many people doing research in this area and it would seem invidious that it has been selected as one of the primary partners on the evidence base. I hope there is a much broader cut through the research being done because we will need it as we move forward.

Finally on the detailed points, the enforcement of the regulatory framework is key to whether this will be a successful démarche. On all the previous occasions we have discussed this, in relation to gambling, addiction and other issues, we have come across the problem that where companies have a legal presence in the UK, there is obviously an easier route through to attaching to them. However, most companies operating in the UK are based entirely overseas, and this is true of the companies we are talking about today. It is a familiar problem. We have been through this so many times that the arguments must be well rehearsed in the department, yet it has not been able to come up with anything new this time. I regret that, because we are stuck with the issue that, while it is very good to see the Government prepared to impose liability on individual members of senior management in respect of breaches of the regulations implied by the new regulator, the business activities will not be affected if the Government lack the powers to do anything about them. The Minister is well aware that in previous discussions we have come to the conclusion that the only real way in which one can get at that is to follow the money. Unless there are powers in the Bill, when it comes forward, to block non-compliant services, and particularly to stop the flow of money, it will not be effective. I hope that message will be learned and taken forward.

The noble Lord, Lord Anderson of Ipswich, raised an important point about the fit with the EU e-commerce directive. I am sure the answer to this is that it cannot be answered, but the issue is clearly there. The e-commerce directive constrains the Government’s ability in this area. Unless they have a way forward on this, we will not be able to get far beyond the boundaries. I will be grateful for any thoughts that the Minister might have on that.

On general points, the right reverend Prelate the Bishop of St Albans was right to pick as his analogy the parallel between the internet and open spaces, and how we are happy to regulate to make sure that open spaces are available and accessible to people. We should think hard about that helpful analogy in relation to the internet. I am also very grateful to my noble friend Lord Knight of Weymouth, one of the few people to point out that we all believe that the sunny uplands of the internet—the safe places in which we gambol and play—have always been a fantastic resource for those able to access and use them. Of course there are dangers, and it has been a bit of a Wild West, but we have undoubtedly benefited from the internet. We must be very careful that we do not lose something of value as we go forward.

I take it from what the White Paper says that it is now clear that there is sufficient evidence from authoritative sources of the harms caused by social media to justify statutory action. Indeed, the White Paper accepts that voluntary self-regulation in this area has failed. I think that is right. However, we need to bear in mind that there is a lot going on. For example, we are still waiting for the Law Commission to finalise its review of the current law on abusive and offensive online communications and of what action might be required by Parliament to rectify weaknesses in the current regime. From earlier discussions and debates, I also anticipate that more legislation will be required to eliminate overlapping offences and the ambiguity of terminology concerning what is or is not obscene. I hope we will have a clear view of what is or is not illegal in the virtual world. It is easy to say that what is illegal in the real world should be illegal in the virtual world, but we now know enough to anticipate that changes will be required to get our statute book in the right order. However, if it is clear what is illegal and can be prosecuted, am I right in thinking that the problem is about how to systematise the drafting of effective legislation for those affected by fast-moving, innovative services on the internet? The software of social media services changes every week, perhaps even more often—every day—and, as many have said, it will be very difficult to find the right balance between innovation, freedom of speech and expression, privacy and the harms that have been caused.

We come back, then, to the very basic question: how do we regulate an innovative and fast-moving sector, largely headquartered outside the UK, and what tools do we have available to do it with? It is true that the technologies in use today represent only 10% of what is likely to be introduced in the next decade or so. How do we future-proof our regulatory structures? That is why the idea of a duty of care is so attractive. Like my noble friend Lord Knight, I acknowledge the work of the Carnegie UK Trust on this, in particular that of Will Perrin and Lorna Woods. There is an earlier legal principle in play here: the precautionary principle that emerged in the late 1990s. Its strength lies in requiring a joint approach to as yet unknown risks and in placing the companies offering such services at the forefront of efforts to limit the harms caused by products and services that threaten public health and safety, but always in partnership with the regulator, to make this public space as safe as the physical space, as the analogy would run.

We support the Government’s proposals for primary legislation to place a duty of care on the social media companies to prevent reasonably foreseeable harm befalling customers or users and to build in a degree of future-proofing that encompasses the remarkable breadth of activity that one finds on these social networks. Having said that, it is important that we think hard about the regulator. This is the point I wanted to come back to. Under a duty-of-care approach, a regulator does not merely fine or sanction but plays an active role to help companies help themselves. It would be perverse not to utilise, for example, the experience and expertise of Ofcom in these earlier stages because it already has a relationship with so many of these companies. I hope that the lessons learned by the Health and Safety Executive over the years will also be tapped because there are other examples, which we will come to.

A few detailed points raised in the debate should be at the forefront of the Minister’s summing up. One is that we do not know enough about the practicalities of physical human monitoring—a point raised by my noble friend Lord Haskel. Here, transparency must be the key. Do we really know what goes on behind these services? If it is all done by automated screening and robotics, and there is a limit on physical human activity, we will never get to the point where we can rely on companies sufficiently. This is an important area, and of course this is before we start raising issues about the dark web, as my noble friend did.

As others mentioned, we are still not clear about the real distinctions between harmful and illegal content, particularly the contextual questions raised about harm. Clearly, as raised by the noble Viscount, Lord Colville, there is the danger of a chilling effect on innovation and development, and I hope that will be borne in mind. We also have to think about the economic and social disruptions. These activities may well be social in terms of social media but their impact on the whole of society is very important, and we need to make sure that the rules and regulations are in place to address it.

With regard to the regulator, there is also the question of what other regulatory functions there should be. When we get to the proposed Bill, we will need to spend some time exploring the boundaries between the ICO and the new regulator, and if it is a new regulator, how that boundary will work with Ofcom. I am sure that point will come up later, so it may not need a response today.

A number of noble Lords mentioned addiction and I have a lot of sympathy with that. I do not think that we have really got to the bottom of the issues here. Addiction to gambling is pretty well known about but gaming is becoming increasingly common in discussions about addiction, and the noble Viscount was right to raise it. There is not much in the White Paper about the research, development and educational work around all this activity. Perhaps the Bill will contain more about those issues once further development and discussions have taken place.

As my noble friend Lord Puttnam said, research on its own, and support for education about the technologies, is not really what we are about here. Both he and my noble friend Lord Knight pointed out that knowledge about the technology does not get you to the point where you understand what the information that you lack is doing to your perception of the world and your views about how the world is going. We need to educate and train people and offer them support, whether they are vulnerable or not, so that they can realise when the facts have been distorted and what they think is true is in fact misinformation. That is a completely different approach and I hope the Minister will have something to say about it when he responds.

This is such an interesting and complex area that we should spend more time on it than has been available to us thus far. The idea of pre-legislative scrutiny of the Bill, and certainly more discussion and debate, is attractive. I hope it finds favour with the Minister.

My Lords, I genuinely thank all noble Lords for their contributions. I echo what the noble Lord, Lord Stevenson, said about the quality of the speeches. There is much to say and I will do the best I can to be clear.

I again make the point that this is not a Second Reading debate. I am not here to defend every word in the document. We are approaching this issue in a genuinely consultative way, as I think we have done from the publication of the Green Paper onwards. However, there is one thing that we are not prepared to compromise on: we do not think that the status quo is acceptable, and we believe that the public support us in that.

We are interested in people’s views and the consultation is taking place at the moment. As I said, there have already been over 1,000 responses. There tends to be an initial barrage of responses. They then tail off a bit, and the more considered ones, with the benefit of research, come at the end. Therefore, we think that there will be a significant amount of consultation. We intend to undertake research during and after that period, based on the consultation, and I, along with my officials, will be very willing to talk to noble Lords about this issue outside the Chamber.

Regarding the potential chilling effect on SMEs of the proposed legislation, I would like to say something about the DCMS. Its responsibilities have grown enormously. We now represent sectors that produce one in every £7-worth of the goods and services produced in this country. We are absolutely concerned with and supportive of innovation and growth. Although we think that this regulation is necessary, we are very concerned that it should be proportionate and risk based so that it does not in any way stop the engine of growth that has taken place over the last few years, particularly in the digital sector, where the growth has been significantly higher than that of the economy. We are undoubtedly a world leader in that respect.

The DCMS also represents culture and the media, so we are concerned with our liberal democratic culture, freedom of expression and the press. We therefore have to achieve a difficult balance. It is interesting that both ends of the continuum have been expressed tonight—that is, noble Lords have alluded to the fact that this is a broad-ranging document but some have said that it does not cover a number of pet harms that they are interested in. Achieving the aims will be difficult but absolutely possible. I will come on to talk about how the harms relate to the duty of care, which I hope will be reassuring. I reiterate that, in replying to the individual points made by noble Lords, I guarantee to take them back to the department and think about them, and I will write to noble Lords if I do not get to the end.

Although this is an important part of the battle against internet and online harms, it is also part of a wider mission that we are undertaking. We want to develop rules and norms for the internet, including for protecting personal data, supporting competition in digital markets and promoting responsible digital design. That is why, on page 31 of the White Paper, we have specifically indicated the areas that we are excluding: areas that are either regulated elsewhere or addressed by other parts of the Government’s activities. This may or may not end up with DCMS, as the noble Lord, Lord Brooke, predicted. That these online harms are addressed is more important than where they end up residing.

I return to the list of harms on page 31 of the White Paper. The noble Lord, Lord Griffiths, contrasted it with the harms outlined in the Plum report. That was commissioned by my department as part of the evidence that will support the online advertising review announced by the Secretary of State earlier this year, as well as the Government’s response to the Cairncross review. These lists were therefore produced for slightly different purposes.

Generally speaking, we know that the list of harms in the White Paper will not incorporate every harm that every person is interested in, or every harm that exists on the internet. Through the duty of care, we want to tell internet and tech companies that they can no longer say, “This is not my problem”. They will have to look at the harms and will have an active duty to educate themselves about the potential harms that their website or app, for example, produces. Even if a harm is not delineated, its absence from the list will not be an excuse for a company. We could have had a list of harms that we thought encompassed everything, but that would have been guaranteed to be out of date in three nanoseconds. The duty of care is there to future-proof this legislation as much as possible.

As I said, we have not included harms that have already been dealt with by other initiatives. I say to the noble Lord, Lord Haskel, for example, that we are not covering the dark web; that is dealt with under a separate programme by the Home Office. Where I do agree with him is that competition law itself will need to be looked at, just as big companies in the past have been addressed by it. We will not do it in this White Paper but, as he will know, the Furman report on digital competition outlined that there is insufficient competition in the digital economy. We will be responding to that soon. The noble Lord also asked about international co-operation and what steps we are taking. During the period between the consultation and our response to it, we will be looking at a concerted effort—a programme, as it were—on international co-operation. We agree that it is important, so we will not do it on a piecemeal basis but will try for a proper strategy. That is one piece of work that must be done.

The noble Lord, Lord Colville, talked about the need for a focused definition so as not to inhibit free speech. We are absolutely focused on that; we believe in it. The regulator will issue codes of practice setting out clearly what companies need to do. If the evidence changes and new harms are manifest, the regulator can react and issue guidance but we will have to make sure the legislation itself is very clear about free speech. We are giving the regulator a duty to have regard to privacy and people’s rights under, for example, the GDPR. That will be absolutely within the regulator’s remit.

The noble Lord, Lord Brooke, talked about health. We will take on board his suggested title for the new legislation. We are worried about health too, so my department has worked very closely with the Department of Health and Social Care. As noble Lords know, the ex-Secretary of State for DCMS is now running that department and speaks frequently on these matters—in fact, he did so today. We have cited the Chief Medical Officer’s advice on screen time and included advocacy of self-harm among the list of harms. We take these issues on board. One of the features we have incorporated in the White Paper is safety by design. [Interruption.] I apologise—the digital part of my portfolio is intruding on me. Safety by design means that all harms, including those related to health, are included, if it is reasonable to take account of them.

The noble Lord, Lord McNally, and the noble Baroness, Lady Benjamin, wondered if we have the flexibility and nimbleness to stay ahead of technology and regulate effectively. We will establish a regulator that will have the skills and resources needed to issue guidance on a range of harms. I take on board everything that noble Lords have said about resources and I will come to that later. We will consider the case for pre-legislative scrutiny, but I must say that at the moment—this is not a commitment or an indication of official policy—we are also very conscious of the need to act quickly. We have consulted on the Green Paper and we are consulting on the White Paper. We are thinking about pre-legislative scrutiny—I know the noble Lord, Lord Puttnam, is an expert on that—but we have not made a decision on it. Whatever happens, there will be plenty of consultation with noble Lords.

We agree with the other point made by the noble Lord, Lord McNally, about coherence across Whitehall. There is a need for coherence on regulatory functions and between departments. We are consulting on who the regulator should be and I take on board noble Lords’ views on that. The departmental lead is DCMS, but it is a joint White Paper, so the Home Office is taking a keen interest in this. As I said before, at the moment there is no prospect of us changing that and I think we are well placed in terms of both knowledge and enthusiasm to drive this forward. I have been told that the Secretary of State has made a good impression so far with his advocacy of this White Paper.

The noble Lord, Lord Anderson, spoke of the need for government to declare boundaries for companies to adhere to, and said that there is currently a democratic deficit, with large, foreign companies often setting the rules. My noble friend Lord Kirkhope also mentioned this. In the White Paper, we are consulting on the role of Parliament in relation to the regulator and, in particular, to the codes of practice it will issue. As I said, we will not provide a rigid definition of all the harms in scope, but we will ask how far Parliament should be involved in the individual codes of practice and to what extent the regulator should be accountable to Parliament—in the way that Ofcom is, for example. We are very supportive of that.

On the regulator, I know that some noble Lords have suggested Ofcom. Obviously, we are consulting on whether we want a new regulator from scratch, an existing regulator or a combination of the two. Obviously, I agree that Ofcom would be a strong candidate if an existing body is chosen, and the White Paper recognises that.

The noble Baroness, Lady Grender, mentioned AI. We mention it vis-à-vis transparency. The regulator will have the power to ask what the impact is, as the noble Lord, Lord Stevenson, said. I take his point about the further need to look at AI and some of the issues surrounding it. We are content to wait; that work will certainly come in time. It is one of the first areas that the Centre for Data Ethics and Innovation is looking at, so we would be interested to hear what it says about it.

My noble friend—sorry, the right reverend Prelate the Bishop of St Albans, who is of course a friend because for some reason we seem to see quite a lot of each other on various issues—talked about gambling, as did the noble Viscount, Lord Colville, and particularly about addiction. The right reverend Prelate mentioned that the regulator needs significant powers and independence to deal with some of the largest companies in the world. He asked if it could be envisaged that some companies could have their licences revoked. That is exactly one of the questions we have asked in the consultation, along with other significant powers of blocking sites and business interruption. So within our suggestions we are talking about pretty draconian powers, but they will be proportionate.

For example, the right reverend Prelate mentioned that the maximum fine at the moment has been £500,000. That is because that was the limit that the regulator—the ICO in this particular case—had. If we follow the GDPR’s lead, it would be 4% of global turnover. Facebook had a turnover of $55 billion, so the fine could potentially go up from £500,000 to $2.2 billion. More important than that is the other suggestion we made about possible personal liability for senior executives and some of the other things I mentioned. We are absolutely conscious that enforcement is a crucial issue in setting up an effective regulator, particularly when so many of these companies are largely based abroad. Another thing we could consider is personal representation in this country, as mentioned in the GDPR.

As far as gambling itself is concerned, we have also tried to avoid duplication, so we are talking about not gambling specifically but of course, as I mentioned before, harms generally. Internet addiction will definitely be in the White Paper’s scope.

My noble friend Lord Kirkhope talked about self-regulation, which he disagreed with. We agree that self-regulation has not worked. It was a good start, and we would expect the regulator to work closely with companies and organisations such as the Internet Watch Foundation in producing its codes of practice. The regulator will wish to learn from these organisations. As I said right at the beginning, however, we think self-regulation has not worked sufficiently. That is why we have decided to establish an independent regulator.

The noble Lords, Lord Puttnam and Lord Knight, both talked about the democratic issue and electoral interference. We talk about disinformation in the White Paper. That is clearly in scope. Specifically electoral matters will be left to the Cabinet Office, which will soon publish a report on what it is going to do. Indeed, I believe that my noble friend Lord Young is answering a Question for the Cabinet Office tomorrow about that exact issue. I mention that merely to give noble Lords the chance to ask him.

Briefly, because I have not got much time, I will talk about a very important point which many noble Lords have mentioned, and that is the media literacy strategy. We understand that regulation is one thing, but making people aware of what is needed in the modern world is very important. We have committed to developing a media literacy strategy with major digital players, broadcast and news media organisations, the education sector, researchers and civil society, to ensure a co-ordinated and strategic approach to online media literacy, education and awareness for children, young people and adults. We want to enable users to be more resilient in dealing with misinformation and disinformation—including in relation to democratic processes—ensure that people with disabilities are not excluded from digital literacy education and support, and develop media literacy approaches to tackling violence against women and girls.

I am running out of time, but I want to be very clear about disabilities to the noble Baroness, Lady Hollins. We will be considering those. I will take back what she has said in detail, absolutely take it on board and definitely consult.

Finally, I was very pleased by, and grateful for, the support of my noble friend Lady Howe of Idlicote. As her speech went on, I was waiting for the “but”, and it sort of came. We agree that filters can be very useful for parents. The online media literacy strategy will ensure a co-ordinated and strategic approach. It will be developed in broad consultation with all stakeholders. As far as online age verification is concerned—which I can confirm will come in on 15 July—I know that there are issues, which she has discussed both during the passage of the Digital Economy Bill and individually with me. We have decided that a review will take place, so we are not going to be including this, but I absolutely take on board the points she has made and will ensure that they are taken back.

There are a number of other points. I will write to noble Lords, as there are too many to mention. There are those—the noble Lord, Lord Storey, mentioned some of them—who say that because the internet is global, no nation can regulate it. If we have a strong regulator with sensibly defined legislation that follows the money, as the noble Lord said, then I do not agree; I think it can be regulated. We will do our best to ensure international support for that. We are well placed to be the first to act on this, and to develop a system of regulation that the world will want to emulate. The White Paper begins that process, and I commend it to the House.

Motion agreed.