Motion to Take Note
That this House takes note of the role played by social media and online platforms as news and content publishers.
My Lords, it is a great privilege to open a debate with such a broad range of informed speakers to follow. The question in front of us raises a number of interrelated and extremely important issues. I shall not attempt to cover them all but, instead, simply to set the scene for the detailed contributions that are to follow.
The interface between humans and information, be it visual, graphic, moving images, sound or text, is as old as our history. Our understanding of what to expect from those interactions is seen through the prism of technological innovations, cultural understanding and legal frameworks. It is encapsulated by the concepts of broadcast and publishing.
In this long history, the online service providers are an anomaly. The military and academic labs where the web originated were home to groups of skilled and active participants in an open web who saw the potential of decentralised networked computers as liberating and democratising. This was a physical network; these were academics and computer scientists bound by cables, not commerce. They did not consider themselves publishers, nor responsible for the content of others.
This view was almost immediately contested and overturned by early court judgments, but the founders of the nascent platforms successfully fought back. Citing the First Amendment, they insisted that their network of small networks had no controlling force and that the occasional misuse or obscenity was a small price to pay for a world with no gatekeepers.
The US “safe harbor” provisions in Section 230 of the Communications Decency Act 1996 allowed online service providers to host, hold and transfer information with no liability for content. This principle was mirrored around the world, including in the e-commerce directive of 2000 that codified online service providers as “mere conduits”. This was Web 1.0.
Much of the internet’s utopian promise came true. But what nobody anticipated, including its founders, was how rapidly it would become highly commercialised. Ironically, the “safe harbor” provisions of Section 230, established to protect the common good from a few dissonant voices, now work against that common good. Those who publish online are incentivised to categorise themselves as online service providers in order to benefit from having no liability for content. It is a commercial advantage that has seen the exponential rise of a vanishingly small number of companies with unparalleled power, no collective oversight and unlimited piles of cash. This is Web 2.0, and it is in that context that we are having our debate.
Amazon has set up a movie studio. Facebook has earmarked $1 billion to commission original content this year. YouTube has fully equipped studios in eight countries. The Twitter Moments strand exists to,
“organize and present compelling content”.
Apple reviews every app submitted to its store,
“based on a set of technical, content, and design criteria”.
By any other frame of reference, this commissioning, editing and curating is broadcasting or publishing.
In giving evidence to the Communications Committee on 19 December, representatives of Facebook and Google agreed that the vast proportion of their income comes from advertising—87% and 98% respectively. This advertising is embedded in, pops up in between and floats across the content that their users engage with. Sir Martin Sorrell, chief executive of WPP, was clear what that means when he said that,
“Google, Facebook and others are media companies … They cannot masquerade as technology companies, particularly when they place advertisements”.
In common with publishers and broadcasters, these companies use editorial content as bait for advertising. They aggregate and spread the news, and provide data points and key words: behaviours that determine what is most important, how widely it should be viewed and by whom. In common with news publishers, they offer a curated view of what is going on in the world.
The Silicon Valley companies are content creators, aggregators, editors, information cataloguers, broadcasters and publishers. Indeed, severally and together they publish far more media than any other publisher in any other context—but, in claiming to be “mere conduits”, they are ducking the responsibilities that the rest of the media ecosystem is charged with.
The media is understood to be a matter of huge public and social interest because it affects common values, certain freedoms and individual rights. For the same set of reasons, it is subject to a complex matrix of regulatory and legal frameworks. But publishing and, by extension, broadcasting are not only legal and commercial constructs but cultural constructs with operating norms that reflect a long history of societal values and expectations, one of which is that those involved are responsible for content. They are responsible because, traditionally, they make large sums of money; they are responsible because they juggle those commercial interests with editorial interests; they are responsible because, within those editorial interests, they are expected to balance freedom of expression against the vulnerabilities, sensitivities and rights of the individual; and they are responsible because they are a controlling force over the veracity, availability and quality of information that is central to the outcome of our collective civic life.
In November, there was an outcry after a journalist reported that algorithms were auto-suggesting horrific videos to young users of YouTube Kids. Google’s response was not proactively to look at the content on its kids’ channel but to ask users to flag content, thereby leaving it to pre-schoolers to police the platform. Google did not dispute that the videos were disturbing or that the channel would be better off without them, but in its determination to uphold the fallacy of being a “mere conduit”, it was prepared to outsource its responsibilities to children as young as four and five.
Whatever the protestations, this is not a question of free speech; it is a question of money. The Google representative giving evidence to the Communications Committee said that to moderate all content on YouTube would take a workforce of 180,000 people. Irrespective of the veracity of that statement, for a publisher or broadcaster, checking that your content is safe for children is not an optional extra; it is a price of doing business, a cost before profit. In October last year, Google’s parent company, Alphabet, was worth $700 billion.
I am not suggesting a return to a pre-tech era; nor am I advocating censorship. The media environment has never been, and hopefully will never be, home to a homogenous worldview. Nor should one romanticise its ability to “do the right thing”. It is a changing and fraught public space in which standards and taste are hotly contested and often crushingly low. But editorial standards and oversight, retraction, industry codes, statutory regulation, legal liability, and parliamentary oversight are no hazard to free speech. On the contrary—as information technologies have become ever more powerful, in democracies we demand that they uphold minimum standards precisely to protect free speech from powerful corporate and political interests.
The advances and possibilities of the networked world will always excite and will hopefully, in time, answer some of society’s greatest needs—but these companies occupy a legal space on a false premise, giving them a commercial advantage based on their ability to publish with impunity. That in turn undermines other media, threatens plurality and increasingly contributes to an insupportable cultural environment fuelled by a business model that trades attention for advertising revenue.
Sean Parker, co-founder of Facebook, said that when setting up Facebook the question on the table was:
“How do we consume as much of your time and conscious attention as possible?”
The answer was that,
“we … give you a little dopamine hit every once in a while, because someone liked or commented on a photo … to get you to contribute more content … It’s a social-validation feedback loop … exploiting a vulnerability in human psychology”.
The hermetic spiral of content ends in ever more polarised views as users become blind to other perspectives, denuding us of a common space. The result is the abuse of public figures and the spread of bullying, hate and misogynist content at unparalleled levels. The ad revenue model fails to compensate content creators adequately and we have seen the wholesale collapse of other creative industries, the long-term cultural costs of which we have yet to calculate.
In the battle for our attention we have seen the weaponisation of information to political ends. While nothing new in itself, the commoditisation of political narratives and the lack of accountability have promoted a surge of fake news, locally and internationally funded, and with it comes a democratic deficit. This was frighteningly illustrated by the outcome of a Channel 4 survey last year in which fewer than 4% of people were able correctly to distinguish false news stories from true ones. The cost goes beyond the cultural and political. Our attention is secured by an eye-watering regime of data collection and with it a disturbing invasion of privacy and free will. The insights and potential for social and political control enabled by unfettered data profiling without redress or oversight undermine our human rights, our rights as citizens and the need for privacy in which to determine who we are as people.
The appropriation of our personal data is predicated on the use of intellectual property law. The very same companies that rigorously avoid editorial standards and regulatory responsibilities for content are happy to employ the protection of terms and conditions running to hundreds of pages that protect their commercial interests. The cherry-picking of regulatory structures is at best hypocritical. Lionel Barber, editor of the FT, suggests that we “drop the pretence”. In a soon-to-be-published paper from a group of industry insiders comes the suggestion of a new status of “online content provider”, with an accompanying online responsibility Bill and a new regulator. But perhaps, just as the arrival of networked computers led to a new legal status of “safe harbor”, the arrival of networked tech conglomerates requires an entirely new definition, based on the interrelation of society and technology.
Because, while big tech has yet to wake up to the societal responsibilities of its current businesses, the rest of us are hurtling towards Web 3.0: a fully networked world of smart homes and smart cities that will see the big five companies—seven if we include China—monopolise whole sectors and particular technologies, controlling both demand and supply, mediating all our behaviours and yet remaining beyond the jurisdiction of Governments.
We must never forget the extraordinary potential and social good in the technologies already invented and in use and in those still emerging, including publishing at a grand scale. However, while the internet is young, it is no longer young enough to be exempt from its adult responsibilities. This is no longer an industry in need of protection while it incubates. These are the most powerful companies in the world.
In finishing, I ask the Minister to tell the House whether the scope of the Government’s digital charter will include a review of the legal status of online service providers and an ethical framework for content. Perhaps he will also say whether he agrees with me that the same standards and responsibilities should apply to the media activities of online service providers in parity with other media players. Finally, what steps are the Government taking to create an international consensus for a global governance strategy for online service providers? I beg to move.
My Lords, I may sound like a long-playing record, but in this debate we have just a few minutes to spare on timings. I ask that every Back-Bench speech concludes as the clock reaches four minutes, as otherwise the wind-up speeches may have to be shortened.
My Lords, I thank the noble Baroness, Lady Kidron, for creating this opportunity for a hugely important debate. It seems to be my lot to follow her in a number of debates where she speaks so eloquently and passionately about the importance of protecting the vulnerable in the digital age. Since she has set out so clearly the challenges and problems of civilising the digital space, I will start by reminding us all of the opportunities and the value. Social media is something that the vast majority of the world now loves. We, our children and grandchildren and our parents and grandparents, in our various ways, all use it for good reason—because it adds real value to our lives. We and they genuinely enjoy the privilege of being able to communicate directly with whoever we want without any intermediation. As we think about the downsides, it is important that we put them in the context of the upsides and the huge possibility and opportunity that social media gives us all.
The technology itself really is morally neutral. It is what we as human beings do with it and how we configure it that drives the good and the bad. Clearly, we have to face up to the bad. I am troubled by the trade-off of choosing between platform and publisher. I worry that we are ascribing old-world, analogue labels to a new-world digital phenomenon. It is akin to looking back 100 years and asking: is the car a bicycle or a train? It is neither and both. Instead of trying to find an old-world analogy, we have to get into the detail of the new-world risks and opportunities, otherwise we simply polarise the debate. I do not think it is a surprise that the biggest proponents of the publisher analogy are old-world publishers themselves, or that the biggest proponents of the platform analogy are the new media companies themselves. Methinks both have vested interests in this debate, and we need to get properly into the detail of the potential and actual harm occurring in this new digital space.
For social media companies, this needs to be much more than fine words. I often think the sole job of the big social media companies, ably represented by very talented people in the UK, is to say no politely to every real request for change. At best, we get fine words and some money donated to education campaigns. What we do not get is what the social responsibility of a social media company ought to be: to roll up its sleeves and dedicate its scarcest resource, the engineers who develop the technology, to configuring it so that we can have the good and mitigate the bad. Having both will require genuine changes to the technology, rather than simply polarising the debate. That is how we will tackle illegal extremist content, fake news, child protection and the protection of intellectual property rights in this space: by real technology changes.
The social media businesses, as the noble Baroness, Lady Kidron, has set out—the biggest, most profitable, arguably most successful companies of this millennium—have the resources and need to start putting them to work on these subjects. If they do not, we need to be willing and able to legislate to make them. It is a responsibility on us as legislators, and on government itself, to make sure that we get into enough detail that we are not ourselves conned into the Punch and Judy show of publisher versus platform, but instead get into the detail of what can practically be done to ensure that we lean into the benefits of the new technology while protecting the vulnerable and some of the most important things in our society: our very democracy and our freedoms.
My Lords, I thank the noble Baroness, Lady Kidron, for making this debate possible. She, along with the noble Baronesses, Lady Benjamin and Lady Harding, and others have performed off-stage heroics recently on behalf of young people in this country. They do it by continually looking to the future, rather than harking back to “Muffin the Mule” and the golden age of “Blue Peter”.
I teach media studies weekly in a number of universities in this country and overseas and I recently made a slide for my students that simply states, “Fake news is a wrecking ball”. I go on to explain that democracy is a fragile concept, and that, just like a wrecking ball, fake news once released is blind to the destruction it causes: destruction to facts, to complexity, to reputations and therefore to that most valuable commodity of all, trust—or, as the noble Baroness, Lady O’Neill, has taught me, to trustworthiness.
The Times columnist Hugo Rifkind set off something of a firestorm just before Christmas when he rightly called out the complicity—albeit possibly careless—of Twitter, Facebook and others in the enabling role they played in the distortion of news through social media prior to both the EU referendum and last year’s general election. I very much liked his closing line:
“Our political class needs to stop rolling their eyes and start paying attention. If the facts don’t move them to care, maybe the humiliation will”.
I believe that humiliation is about to break over us in the form of ever more conclusive evidence of the degree of pernicious activity that went into achieving Vladimir Putin’s wet dream: Britain’s detachment from the European Union, and I stress “Union”.
Last year the Information Commissioner, Elizabeth Denham, started a formal investigation into the use of data analytics for political purposes. She said this recently:
“It’s a complex and far-reaching investigation, involving over 30 organisations including political parties and campaigns, data companies and social media platforms … A number of organisations have freely co-operated with us, answered our questions and engaged with the investigation. But, others are making it difficult. In some instances we have been unable to obtain the specific details of work that contributed to the Referendum campaign and I will be using every available legal tool and working with authorities overseas to seek answers on behalf of UK citizens”.
She said that she had been,
“forced … to invoke our statutory powers to make formal demands for information”.
That is pretty serious stuff.
This is made all the more serious when we consider the outcome of research recently conducted by the Stanford Graduate School of Education, which revealed that 80% of middle-school students could not distinguish between real news and content paid for by an advertiser. Sam Wineburg, the author of that report, said at the time of its release:
“Many people assume that because young people are fluent in social media they are equally perceptive about what they find there. Our work shows the opposite to be true”.
This is not helped by the fact that in 2015, 64% of nine-year-olds in the US were found not to be reading at or above proficiency level. Before any complacency is allowed to set in, it is worth noting that our figures are not much better. So, what do we do about it?
Time does not allow me to go into too many possible solutions, although some exist. I recommend that anyone interested checks out the website of a five-year-old organisation named Newsela, which seeks to address this problem in the classroom by using news content as a teaching tool. It has licensing arrangements with, among others, the Associated Press, the Washington Post and Bloomberg. Encouraging the use of the best and most reliable news content as a classroom resource at primary level could, in my judgment, do very little harm and possibly a great deal of good.
Finally, there are a number of other ways in which trust could be developed and the worst impacts of social media reduced, but to do that will require the wholehearted attention of the Government in general and the Department for Education in particular. This is not simply a problem that needs to be addressed. It is a very real present-day crisis and one that deserves to be taken far more seriously.
Until the last decade, media platforms were pretty much locked into a one-size-fits-all broadcast model. Success with advertisers depended on producing content that would appeal to the widest possible audience. The recent development of tablet and smartphone technology has been the game-changer, creating a delivery system available pretty much everywhere, 24 hours a day, along with highly personalised and segmented channels.
We are in a wonderful new world of information, education and communication but, as we have heard, there are also serious downsides that we have to address. In a powerful article in this month’s Washington Monthly an early investor in Facebook, Roger McNamee, describes how the algorithms created by Facebook analyse your responses to what you see and then give you more of the same. He argues that negative and hostile messages provoke the strongest responses and demonstrates how these have been used in the referendum campaign here, as well as in the French, German and US elections. Tristan Harris, formerly of Google, has talked about the public health threat from social networks such as Facebook. He calls it “brain hacking”.
We are legislators and we like to legislate: if you have a hammer, all problems tend to look like nails. Widespread, piecemeal legislative change is not the whole answer here. We need to ensure that our education system builds in an awareness of issues such as privacy and safety online, harassment and bullying, as well as critical analysis of the news. The major platforms must do more to create fake news warnings. Education about how data is used could create more pressure from users for transparency about how their data is used. I do not think most users of social media recognise that they are not customers; they are the product. The terms on which users engage—the permissions—should be rebalanced in their favour. Ideally data should belong to the users, not the platforms, and its use should certainly be time-limited.
There are some signs that things are beginning to change. An article in this week’s Politico notes that,
“a growing number of internet users are turning to new applications and tools that prevent companies and governments from building up a profile of them”.
This is in its infancy and mostly in the business sector but I believe that more will emerge. Education needs to extend beyond school and should definitely include legislators. We—I include myself in this—are not sufficiently well equipped to make judgments in this area. In New York, a city council member called James Vacca promoted a Bill to provide greater transparency of the algorithms now used to determine how public services are allocated. He has recognised that transparency in this area is a key to modern political accountability.
There is also the issue of net neutrality, currently provided for by the EU regulation on open internet access. This means that ISPs cannot block or slow down data for competitive or commercial purposes. Post Brexit, we need to ensure that companies selling content and services are not able to reduce consumer choice by abusing that position.
To end on a positive note, Reuters business news carried a story on Tuesday about how some investors in high tech are becoming increasingly concerned about the addictive aspect of their activities and their impact on children. They are changing their investment patterns accordingly. Encouraging institutional investors to bring pressure to bear on the digital giants could have a significant impact. That would be especially true of the huge public sector pension funds around the world.
As I left to come over for this debate, I received an invitation from the Westminster Abbey Institute to a talk about truth in politics and the ethics of negative political messaging in social media. Perhaps we should all go.
I declare an interest as a series producer at ITN Productions. I too thank my noble friend Lady Kidron for securing this timely debate.
I am concerned by the rapid decline of quality journalism in this country and across the western world. The way social media platforms operate means that factually based journalism is under attack as never before. That is bad enough, but the way that news and views are disseminated on these platforms is creating an echo chamber. It excludes diverse voices and exaggerates the opinions people already hold: this is the filter bubble. I fear that it causes increasing political polarisation, which we see across the western world—a populism fed by social media, where emotion triumphs and reasoned discourse is defeated.
The problem lies in the failure of social media platforms such as Facebook to value the quality of the content their users are viewing. Their main concern is the number of eyeballs and the length of time they engage with the platform, so that they are exposed to the adverts that almost entirely finance these companies. As these platforms become the dominant medium by which a whole new generation receives its news, this must be of the greatest concern to noble Lords. Of course, listening to your friends, or sympathetic points of view, is what we all do and what humans have always done. We want to rely on people we know and trust. But people inevitably pass on information that amuses or shocks them, rather than wholesome pieces of impartial information.
This has been exacerbated in the case of Facebook by the changes it has made to its algorithms. In June 2016, one such change meant that the “likes” of friends and families superseded the users’ own preferred pages, the aim being to engage the user ever more deeply with others on the platform and keep them viewing for as long as possible. However, it is very difficult for independent researchers outside the platforms to find out about these changes in algorithms and their effects. The information is closely guarded by the social media companies and all research is carried out in-house with an inevitable conflict of interest, which discredits the findings.
Independent research has been carried out into the way users consume news on Facebook. Dr Shan Wang of Harvard University found that half the people surveyed saw no news in their first 10 posts, and that was a very loose term for news: it included celebrity gossip and sports news. Only 1% of the users had news stories as their majority content. Of the news that they received, more than half came from friends and only 4% directly from the publishers.
It is a far cry from what is now called the “legacy media”, or the quality newspapers in which specialist journalists curate content. Some of this quality journalism does indeed appear on the platforms, but it is taken out of context and is just another piece of disconnected information among a raucous raft of considerably less reliable sources. Facebook’s “Instant Articles” feature is a means by which the platform exposes users to a range of news outlets. However, it is hard to discern the provenance of the information. High-quality publishers are placed alongside websites peddling rumour and lies from some very dubious sources. Not only is much high-quality journalism suffering as newspapers’ advertising revenue reduces as it transfers to the platforms; the impact of such content is being dramatically diminished.
I am also concerned about the effect of user preferences on the role of British broadcast journalism in this environment. The temptation must be to loosen the constraints of impartiality in a world where opinion is king, but I would argue that Ofcom and the BBC must hold fast so that impartial content can be shared and passed on to friends and family. These sources of trusted and fact-checked information underpin our democracy and secure its future. Recent research by the Reuters Institute for the Study of Journalism at the University of Oxford shows that impartial broadcast news in this country creates trust and acts as a bulwark against polarisation. It compares very favourably with the lack of trust Americans have in their media, which has no place for impartiality and is driven by editorial bias.
The 2017 Conservative Party manifesto declared:
“We will be consistent in our approach to regulation of online and offline media.”
We must ensure that the filter bubble does not cut the people of this country off from diverse news, opposing views and even opinions that might offend them. If social media platforms do not take more responsibility for their content, alter their algorithms accordingly and go much further in curating their content, I fear some kind of third-party regulation will be required to intervene in the closed world of social media platforms.
My Lords, I thank the noble Baroness, Lady Kidron, for obtaining this debate. I, too, thank her for her tireless work in this area.
Social media and online platforms now play an enormous role in shaping national dialogue and accepted social standards. In my visits to primary schools and secondary schools in the diocese of Gloucester, I have spent time talking with children about social media, and I affirm all that is good. Yet, as children progress to secondary school, their view of themselves and the world is increasingly being shaped by social media and online platforms. Young people are receiving strong messages about worth being about looking a certain way and about success being measured in online likes. Furthermore, their fears about the world they are growing up in are being fuelled by what they read online.
The content we consume shapes how we see ourselves, other people and the world. It is no longer sufficient for social media and online platforms to cling to a simple dichotomy of platform versus publisher in order to escape responsibility for the content they promote and share. While previous generations’ engagement with media might have been limited to print media and television broadcasts regulated by formal standards and watersheds, modern consumers, including children, are exposed to huge swathes of unregulated content. Research conducted by the UK Safer Internet Centre in 2016 found that more than 80% of the teenagers surveyed had seen or heard online hate about a specific group.
While we would be foolish to think that we can legislate for human relationships, we have a responsibility as parliamentarians to create legislation that protects the vulnerable and promotes the kind of society we desire to live in, particularly given problems we have been hearing about around hate speech, fake news and extremist content online. The Communications Committee of this House last year produced an extremely valuable report Growing Up with the Internet, and I hope that the Government will take heed of the need for there to be a requirement for firms proactively to develop software to identify and remove harmful content as well as to ensure that design is child-friendly with default settings to protect children’s privacy and safety. I am very disappointed that the Government have chosen not to accept the recommendation of an independent children’s digital champion. I would like to know how there will be effective accountability without that.
I would also like to draw attention to the recent review Intimidation in Public Life published by the Committee on Standards in Public Life. It has a number of helpful suggestions, and I hope that the Minister can assure us that the Government will accept them, in particular, a requirement for social media companies and online platforms to publish quarterly reports on their progress on removing reported content. It would also be good to know from the Minister whether there are plans to monitor the implementation of the legislation in Germany that will fine firms which are insufficiently quick to remove illegal content identified by users.
Alongside the need for such legislation, it is good to know that the BCS, the chartered institute for IT, is endeavouring to facilitate solutions-focused dialogue between social media companies and political parties. I understand that Twitter and Facebook have opted in, and I hope that there will be participation across all political parties. I also hope such dialogue will contribute to legislation rather than being a completely alternative path.
I am grateful for today’s debate and sincerely hope that the outcome of our discussions will have a real impact on how we relate to one another online.
My Lords, I add my congratulations to my noble friend Lady Kidron on securing this important and timely debate. The social media industry is evolving very quickly and, as we have heard already, reality has overtaken the traditional ways of looking at news and publishing.
The large social media companies have become an important source of news for many people—indeed, for younger people, it seems they have already become the main source of news. A small handful of social media companies now have a dominant position and are driving advertising away from traditional news outlets. This dominance has been strengthened further as a result of consolidation amongst the big players, such as Google’s acquisition of YouTube or Facebook’s acquisitions of Instagram and WhatsApp. Indeed, Google and Facebook now command a level of dominance over the media industry and advertising revenues that Rupert Murdoch could only ever dream about—Facebook has more than 2 billion active users.
Increasingly, these social media companies are actually determining what news we see. Whether this is purely by algorithm or by human intervention makes no difference—they are still choosing the stories that we read. We have to question the extent to which advertising, both overt and covert, influences what the social media companies show us. Sensationalist “fake news” stories generate more hits and therefore more advertising revenues. There is little commercial incentive for these largely unregulated companies to police this, and the record of them doing so, so far, is very poor. That said, there are some welcome signs that the big players are starting to understand that they have responsibilities. The arguments made by social media that they are simply technology companies with no responsibility for what happens on their platforms are looking increasingly threadbare. Some regulation is, unfortunately, now necessary.
However, there is a spectrum here. Should a closed family WhatsApp group be regulated in the same way as a curated newsfeed? Should a small specialist chatroom, run by enthusiasts, discussing, say, hockey, be regulated as a newspaper? I would suggest not. We need to find a balance. Regulation as news and content publishers only solves part of the problem. For example, filter bubbles exist just as strongly in traditional media: many people read just one newspaper of a particular political colour. Education to encourage young people to question what they are reading is therefore, I believe, of fundamental importance. Social media can be a force for good here and provide access to a greater variety of sources, if done properly.
A key question from my point of view is why many people seem willing to behave online in a way that they would never do to people’s faces—bullying, hate speech, trolling, even death threats. I suspect that this may be due, at least in part, to the culture of anonymity that pervades social media. I do not have an easy answer to that. There is a strong argument that anonymity is important for freedom of speech, particularly in situations where dissent is dangerous. But it seems to me that, in addition to recognising the reality that the social media giants have become, in part, publishing companies, we also need to look very closely at the question of online anonymity.
My Lords, I join others in thanking the noble Baroness for securing this very important and timely debate. As it impacts on the media, I declare my interest as executive director of the Telegraph Media Group.
The digital revolution has been the most extraordinary transformation in the way that knowledge and news are transmitted since the arrival of the printing press in the 15th century, when information housed for centuries in manuscripts in monasteries first became publicly available. The result was an eruption in learning from past texts, theological ideas and astronomical theories that changed the world. Now we have a second shift in knowledge, which is just as powerful as that, with an explosion both in the production of content and in global access to that content, which has changed out of all recognition the way we communicate, do business, assimilate news, and indeed think and act. The next revolution, in artificial intelligence, will mark a further fundamental shift in the use of that information. The consequences of that are impossible to predict.
I welcome all that because anything which spreads knowledge across the globe is a good thing, but we have to be clear at the same time about the consequences for our society and our democracy. The giants that have powered this—Google and Facebook—are barely two decades old. They started off, as the noble Baroness said, as tech companies and, quite rightly in my view, were able to undertake the early exploration of the potential of the internet and the digital world largely free of regulation or legal restraint. But the world has changed fundamentally in those 20 years and these companies have in effect become public utilities. As we heard just now, Facebook’s active users now number over a quarter of the world’s population. For the commercial media, and the existence of an independent press, that has profound consequences because of the migration in advertising spend. As the noble Baroness also pointed out earlier, ad revenues have shifted dramatically online. It is estimated that, by 2020, more than 70% of all advertising spend will be with just Google and Facebook, with programmatic advertising fuelling fake news sites and other harmful content.
Ironically, the best antidote to the problems we have encountered with fake news is a free and independent media, which must remain the custodian of democratic debate and scrutiny. News media publishers therefore have a vital role to play in online content creation—indeed they are already the biggest investors in it. Nearly 60% of investment in UK original news content comes from newsbrands, and publishers now invest at least £100 million in digital services. But the companies benefiting from that investment are of course the global tech giants which rely on content from newsbrands to power their services. Content from UK news brands drives around a billion social media interactions a year, and eight of the top 10 most shared UK websites on social media were UK news media sites. As the New Statesman succinctly put it recently,
“most media organisations are now tenant farmers on Facebook’s estate”.
So we have the irony that the advertising revenues that fund the trusted news that people want are diminishing rapidly and its providers are heavily regulated, while the platforms and the social media that feed on them are almost wholly unregulated and growing exponentially. That disparity will be made much worse as a result of the amendments to the Data Protection Bill that the House passed last night—I had to say that just for the sake of the noble Lord, Lord McNally. Not least as a result of that, the stage is set for the growth of fake news here, fuelled by advertising supply chains described recently by Marc Pritchard of Procter & Gamble as “murky at best”.
In conclusion, technology has changed the face of the world and, in spreading knowledge and information, has been a source of great good, but with power comes responsibility, and it is surely the responsibility of all those involved in regulation and lawmaking both to ensure the financial sustainability and independence of free media producing real news in this country and to tackle the issues of liability for illegal content, enforcing copyright and defamation and ensuring the fitness and transparency of the advertising supply chain that will ensure that real, verified news continues to thrive as part of a diverse and vigorous digital environment.
My Lords, the noble Baroness, Lady Kidron, made her case brilliantly, and I join others in saluting the leadership that she is showing the House by raising these issues and getting action in the Data Protection Bill. Her vision is exactly the opposite of what we see in the United States from the FCC in ending net neutrality, which is taking that country into such a dark place in this context.
I remind the House of my interests in respect of my employment at TES Global, which is both a publisher of what used to be known as the Times Educational Supplement and the provider of a substantial platform for teachers to find jobs and share teaching resources, so I find myself on both sides of the false argument, as described by the noble Baroness, Lady Harding. Indeed, I am also an avid user of social media. On the way in here, LinkedIn told me to congratulate one Ray Collins on seven years working at the House of Lords, which I am of course happy to do.
Social media and user-generated content are here to stay. I do not believe it is possible to pre-moderate all the content shared all the time. If we were to ask social media companies to do so, it would be an extraordinary barrier to entry for anyone wanting to create competition in this space, which I think we would want if we want the sort of new tools and platforms described by the noble Baroness, Lady Scott. But I do not disagree that we need to do much better, especially on content accessible by children. It needs new policy thinking and a regulatory solution that respects the consumers’ desire to share digital content and their need for trusted content and providers. My view is that tech companies are media companies, but that does not mean that the regulatory regime for traditional media is appropriate or in the public interest. Like the noble Baroness, Lady Harding, I think we need a new regulatory regime for online service providers.
The media need to keep evolving their business model, away from monetising content towards generating traffic and data for other purposes within the business, and will need new models of regulation in the new environment created by the GDPR. I also think that it is in the interests of the likes of Facebook to ensure that advertising revenue is fairly shared with the media companies whose content is widely shared for free on their platforms. I am told that they are having those conversations; I hope that we will get concrete action.
I also echo the point made by the noble Baroness, Lady Harding, about engineering time. Perhaps Google should extend its famous 20% time to a percentage, let us say even 10%, of its engineers’ time to help solve some of these problems.
The issue of fake news is, of course, looming large. I am interested in how France’s President Macron is suggesting regulation of social media platforms during election periods. Perhaps we could restrict sharing to Ofcom-regulated news outlets; I do not know. We will have to see how that and the German experiment work.
Fundamentally, I would love to have a counterpoint on social media to my echo chamber. If I could press a button and see that, it would help my sense of what is going on on the other side. We need accountability, not always up to regulators but sometimes down to users. We will get some of that thanks to the Data Protection Bill: the right to an explanation and some data mobility measures provide accountability. I am also interested in data trusts and the politics of data. Perhaps we will end up needing collective action, the equivalent of a new digital trade union movement for platform users, so that we can impose some data ethics and withdraw our data from the companies that are so hungry for it unless they give us the ethical safeguards that we need.
Finally, I echo what my noble friend Lord Puttnam said about more study of this in schools and in wider society. Then, perhaps, we can have an informed debate to find an imaginative policy solution to these pressing issues.
My Lords, I too congratulate the noble Baroness, Lady Kidron, on her sterling work and on securing this important debate. I declare an interest as vice-president of the charity Barnardo’s, which in 2004 produced the first ever publication in the UK to address growing concern about the ways in which children and young people may be at risk of harm online. Just One Click outlined the ways in which children were sexually exploited using the internet and mobile phones. Some were forced to pose for abusive photographs. Others were subject to sexual assault broadcast live via pay-per-view websites.
Barnardo’s 2015 report Digital Dangers recommended that there needs to be an assessment of products, such as games and apps—both those currently in use and those in development—to ensure there are safeguards in place to prevent children being harmed. This should include manufacturers providing evidence that every effort has been made to ensure that children are safeguarded.
Every nine minutes, a web page shows a child being sexually abused. To combat this harrowing crime, countries need to work together on an international level. I appreciate the challenges that social media companies face daily to monitor content. On an average day Facebook has 1 billion users sharing photos, live videos and messages in a vast variety of languages. However, in 2017 it came to light, through a BBC investigation, that Facebook failed to remove up to 80% of the images reported by users as sexual images of children. By failing to adequately address harmful content, or put in place effective mechanisms of reporting, corporations are blatantly avoiding the moral responsibility to protect those vulnerable children and young people in our society.
Last month, an investigation by the Times revealed that child sex abuse images continue to be published on YouTube. I hope the Minister will be able to tell the House how this material can continue to be available in the light of the role of the Internet Watch Foundation. I hope he will also set out how the accessibility of this type of material will be affected by a number of new policy initiatives.
First, will YouTube come within the scope of the social media code of practice that is proposed as part of the internet safety strategy and, if so, how will the code constrain similar content? Secondly, will the children’s age-appropriate design code, which will be introduced after the Data Protection Bill becomes law, reduce the amount of this sort of material on YouTube—a site that is so popular with many young people? Thirdly, how will the BBFC tackle this sort of material in its role as age-verification regulator under the Digital Economy Act 2017?
In 2016 and 2017, I raised the question of how social media and media sites would be treated under the Digital Economy Act. In her evidence in 2016 to the Lords Select Communications Committee, on which I sit, the noble Baroness, Lady Shields, stated:
“Twitter is a user-generated uploading-content site. If there is pornography on Twitter, it will be considered covered under ancillary services”.
We know that a vast amount of pornography shown on media sites is user-generated. I should be grateful if the Minister would update the House on the remit of the regulator in relation to user-generated content on YouTube channels and other social media. How is it coming along?
The Government have made a good start, first with the Digital Economy Act 2017, publishing the Internet Safety Strategy Green Paper, and working with the tech industry. However, for us to become the safest place in the world for children and young people online, we need to create a culture where social media and online platforms act ethically and feel a sense of social responsibility, integrity and morality when creating, maintaining and updating platforms, and are subsequently held to account. For the sake of the future, let us put children’s well-being first.
My Lords, I thank my noble friend Lady Kidron for tabling the debate this afternoon and, in doing so, I declare my interest as a board member of the BBC.
Social media has transformed many disabled people’s lives. It has allowed new and news media to flourish and encouraged sharing of information. I would not want to go back to a time before social media and the internet. I remember being in the USA when the worst of the Rwanda genocide was happening and knew nothing about it because it was not covered anywhere.
I have spoken previously about how information is pushed through algorithms that try to second-guess preferences. While that may be valuable for advertising, we need to be reminded that, although the choice it gives us may appear greater, in reality it is far narrower than it seems.
I use social media quite a lot, and I have had many positive experiences. Sitting very late one night in your Lordships’ Chamber, I tweeted that I had not had anything to eat and within minutes had had several offers of pizza at the Peers’ Entrance. At 12.14 pm today I found out that there was a possibility of a joint Korean team competing at the Winter Olympics in hockey; at 1 pm I found out about a young disabled woman who has had her speech machine stolen and cannot communicate with her family.
However, I wish to talk about a very personal experience of social media. On Christmas Eve, I posted a moment in time. Ultimately, it was not going to change my life, but I could not get on a train. It was annoying and a bit irritating, as every other non-disabled person who was on the platform was able to get on, as they had the two previous trains. I did not think it was a news story but apparently it was. It showed how little control I had over something that affected me. Within minutes it was on news sites and I was taking calls from local and national newspapers. It received 320,000 impressions, 511 direct responses, 1,522 retweets and 1,360 likes. When I tried to rationalise it, I thought my post had raised an issue that affected millions of disabled people and helped others to articulate the experiences that they had. Of course, it brought out the trolls; the best that I can repeat is that people like me should not be allowed out. But this was something that I had spent less than a minute contemplating posting—possibly a valuable lesson for us all.
While propaganda may always have existed, it is now about the speed at which “news”—I say that in quotes—travels, and the responsibility that comes with it. It is not the same as a traditional news outlet and we need to think very differently about how it is regulated. I have concerns about the proliferation of these sites. What one person calls news, for another person is chip paper. You choose to follow a view because you agree or do not agree with it. I support difference of opinion; we need to be challenged to get the best out of the decisions that we make.
There is a great deal of positivity, and there are very responsible outlets that work hard to educate. My daughter is 15, and at her school they educate pupils through delivery of the EPQ skills lessons, with sessions on data literacy. They are taught to have a healthy scepticism about statistics; there are separate sessions on evaluating sources, both on and offline, and they cover fake news. So education gives you a choice. I understand that the Government may not want regulation; there is an element of Big Brother to that. Does the Minister agree, however, that social media companies should take much greater responsibility for the content distributed through their platforms?
Finally, while I recognise the positivity of social media, with the speed of development of platforms and technology we need to be much more mindful of what failure of self-regulation may look like and remain ahead of the curve, because dialling back is just too awful to contemplate.
My Lords, I thank the noble Baroness, Lady Kidron, for her timely and important debate. I sit on the Select Committee on Political Polling and Digital Media. Over the course of many hours of taking evidence has come the growing and uncomfortable realisation that one of the key sources from which we gather our news may be open to manipulation. Where once we gathered our news from a trusted newspaper or broadcaster, we now have an infinite number of sources at our fingertips to navigate at high speed every day. We are less certain of our sources; our judgment of the content is clouded by the lack of context—and as for the content itself, this is a whole new world. It is a far cry from the regulated broadcasters, or even our self-regulated newspapers. This is a world where no one is made to feel responsible or accountable for what is said, least of all the media giants that provide the platforms. They have huge power with no responsibility.
Social media are sometimes a force for good and for necessary change, but often not. So we enjoy one of the greatest revolutions of our age, but with it comes the inside of Pandora’s box—a generation of young people who have to grow up under the constant pressure of social media, which must be in part responsible for a near epidemic of mental health issues. Online platforms offer all too easy access to indecent images of children and of terrorism. There are growing concerns about how our democracy is being undermined and manipulated, and concerns around so-called fake news and the lack of transparency in political advertising, as well as about what is real and what is a Russian bot. The media giants simply shrug their shoulders and say that it is not their problem, firm in their position that they provide the platform and the responsibility lies at the point of use. This is partly a matter of principle for them—a libertarian defence of freedom—and partly practical. How do you regulate something like the net, which is as shifting as a global desert? As the storm rages, we ask ourselves as we do today: what should be done? We can look at the question of whether the likes of Google and Facebook should be reclassified as publishers. I am yet to be convinced that this is the solution, for the reasons that many have given so well this afternoon, including my noble friend Lady Harding.
One thing is crystal clear: regulation in one form or another is coming to this sector whether it likes it or not. While we debate among ourselves the sort of regulation, social media giants should start trying a lot harder to solve some of these problems themselves. This falls well within their reach. Algorithms should be altered so that indecent images of all kinds are less easily accessed, if accessed at all. There should be more transparency around political advertising. Companies can do more to monitor the content on their sites and weed out the bots. It is simply not good enough to sit back and take no responsibility.
However, we must also be honest with ourselves that, while this battle is worth fighting, it is a battle that we will never entirely win. Ultimately, we must protect the integrity of our society and our democracy through the exercise of our own judgment, by learning to navigate the web from an early age and assessing the validity of what we read. This should be a partnership, with proactive social media and online platforms trying to fix the problems, regulation to address those which are not being fixed and vigilant citizens addressing those problems which might never be solved by either. Together we will never stamp out all the bad, but we are more likely to navigate away from it.
My Lords, I add my thanks to those of other noble Lords to my noble friend Lady Kidron for initiating this debate. I begin by declaring an interest as chair of the Committee on Standards in Public Life, which has a particular interest in this subject, as the role of social media took up a large part of the report that we produced before Christmas at the Prime Minister’s request on intimidation in public life.
For the committee’s review and publication of the report on intimidation in public life, we were interested in the role of social media companies in relation to illegal content, particularly threats of violence and illegal hate speech, such as racist abuse. Let me say straight away that we recognised that, in many respects, social media is a force for good and a democratising force for expression in our public life, and that it promotes engagement with politics in important ways. None the less, the scale of the problem which confronted us disturbed us. We had to come to terms with the fact that the legislative framework governing the responsibility of social media platforms is based on the EU e-commerce directive of 2000, which was framed well before social media companies and online news platforms existed in their present form, when they were essentially fledgling bodies.
The e-commerce directive shields companies from liability for illegal content where they are simply “hosts” and where their relationship to content is “technical, automatic or passive”. This exemption from liability requires that the company does not have knowledge of the illegal content, and takes it down expeditiously if it becomes aware of it. This formed the basis for what is known as the “notice and takedown” model. Our committee took the view that it is no longer appropriate to see social media companies as mere platforms. These companies choose the format in which users can post content and they curate that content, using algorithms to analyse and select content, including for commercial benefit. This is well beyond the role of a passive host. But nor are they publishers which should be held fully responsible for all their content, because they do not approve every item that appears on their platform and they do not create the content themselves.
Our committee concluded from this that we need new categories and new ways of thinking about this problem that go beyond the platform/publisher distinction; that we need to think properly about the role and responsibility of social media companies; that there should certainly be a shift in the liability towards social media companies for illegal material; and that the Government should bring forward legislation so to do. I have to say, when the committee started work on this, this was not a conclusion that was in our minds, but it was a function of our many discussions during the period of work on that document.
This shift in liability could be for particular types of content, or could be based on how difficult or how expensive automatic monitoring or removal of types of content is. As my committee made clear in our report, to address intimidation will require all those in public life—this is broadly across the problem of intimidation—to come together and work constructively.
I am very grateful to the right reverend Prelate the Bishop of Gloucester for mentioning my next point. Our committee also agrees with her that the BCS, the Chartered Institute for IT, which is convening discussions between the social media companies and the political parties to think about solutions to online abuse and its effect on the democratic process, is doing valuable work. The Committee on Standards in Public Life fully supports that work.
My Lords, I too congratulate the noble Baroness, Lady Kidron, on bringing forward this debate. I declare my interests as set out in the register.
It is my firm view that it is almost invariably technology, not politicians—not us—which determines the character of the world in which we live. That is nowhere better illustrated than in the transmission of information, particularly through the development of digital technology. Looking back, let us consider how information has been transmitted. First, it was by word of mouth. Then writing was developed, which led to messages being moved backwards and forwards. Then there was printing, which made the messages much more widely available. That, in turn, was distributed ever more effectively, not least by the development of the railways. Then, in the 19th century, we saw the possibilities of transmitting voice messages through telegraph and radio. In the last century, moving images were transmitted through television and so on. We have seen the development of engineering, enabling reverse path messages to be easily available, payment systems, all kinds of point-to-multipoint messaging and the creation of a whole spider’s web of relationships. We have seen in all that the privatisation and democratisation of the process of transmitting information. Now we are in a world where it is relatively easy to create complicated images and then transmit them cheaply all around the globe. No doubt there is an awful lot that none of us can imagine which will happen in the near future. Therefore, it is hardly surprising that the topography of this landscape changes and, a decade on, I dare say that it will be very different from what we see now.
As has been mentioned, the key player for the last 400 years or so has been the concept of the publisher, because it is he who draws together all that has been going on. He inevitably, then, becomes the focus of the relationship between the information and the law, because, after all, authors are frequently men of straw. Of course, our current concern hinges around platforms: are they publishers or are they distributors? In our country, historically we have taken a very pragmatic and sensible view about distribution: the distributor is a mere conduit necessary for the dissemination of information to take place, particularly if he does not know what he is transmitting. What are platforms? Are they simply distributors or are they publishers? After all, they generate huge amounts of advertising and make all kinds of selection via algorithms. As my noble friend Lady Harding said, the old distinctions just do not seem to work anymore.
Many of the ills that have been pointed to in this debate could be dealt with in a domestic context if they fall within domestic jurisdiction. However, the internet and the big players do not; that is, after all, the point of the internet. They are more or less effectively footloose and fancy free across the net if they wish to be so, so the old-style general legal-based approach to general regulation is completely impractical. Unless we have a single global jurisdiction, it will not work. Rather, Governments must get together with these companies, which after all need Governments and the consumers as citizens to work out a responsible way of developing codes of conduct and modus operandi. This will not necessarily be straightforward, and I think means that Governments will have to treat these companies analogously to those other countries they deal with in intergovernmental negotiations. I do not suppose they will like that but it seems to me that is how it is. This has implications for democracy and all other kinds of parliamentary and legal behaviour.
At the end of the day, we must not forget the most important aspect of all, the principle of freedom of expression, for that is one of the fundamental attributes of liberty.
My Lords, I too warmly congratulate my noble friend Lady Kidron on this important debate and on the way she opened it.
I fully acknowledge the considerable benefits of social media and online platforms, but will home in on their role as publishers of online gambling opportunities and pornography and the risks this poses to 11 to 16 year-olds. A Gambling Commission report published just last month showed that in 2017, 3% of 11 to 16 year-olds spent their own money on online gambling. This is illegal and should have been prevented by robust age verification checks. I note that 3% also gambled in 2016—thus there was no improvement between 2016 and 2017—and therefore ask the Minister what the Government and the Gambling Commission are doing to address this in 2018. The report also shows that in 2017, 11% of 11 to 16 year-olds played gambling-style social games, which are often free to play and offer no cash prizes, with 73% of them played via apps on smartphones and tablets and 28% via Facebook.
Any thought that the industry might get its own house in order voluntarily seems highly questionable, given that revelations in the Times in October that online gambling opportunities were being marketed with children’s cartoons have now been followed by revelations in the Guardian just after Christmas about the role of Scientific Games. The article stated:
“Scientific Games, a US firm that has provided FOBTs to Ladbrokes and casino games for several gambling websites, makes a variety of these ‘social games’ available as apps on Facebook. One of its apps features the children’s cartoon characters The Flintstones, while another is themed around the Rapunzel fairytale”.
It is of great concern to me that these apps are so easily accessible to children. The article quoted Mark Griffiths, a professor of behavioural addiction at Nottingham Trent University, saying that social games were the “number one risk factor” for children becoming problem gamblers, even if hosted on Facebook rather than a gambling site. Despite this, however, gambling games that do not involve money do not meet the definition of gambling in the Gambling Act 2005, and they are not regulated by the Gambling Commission.
I understand that these games may be targeted at those legally able to gamble, but they clearly appeal to children. Indeed, the press release accompanying the Gambling Commission’s report last month said that,
“new technology is providing children with opportunities to experience gambling behaviours through products, such as free-to-play casino games, social media or within some computer games, which do not have the same level of protections or responsible gambling messages as regulated gambling products”.
In this context I ask the Minister, is it not now necessary to amend the Gambling Act 2005 to broaden the remit of the Gambling Commission to deal with gambling games when no money is involved?
Thirdly, social media acts as a forum for gambling advertising aimed at young people. The majority—70%—of 11 to 16 year-olds have seen gambling advertisements on social media, and one in 10 follows gambling companies on social media such as Facebook, YouTube and Instagram, and so receives positive messages about gambling via these sites.
Finally, I highlight the role of social media and online platforms in the publication of pornography that is illegal but does not meet the narrow definition of extreme pornography, and that appears on social media sites or online platforms most of whose content is non-pornographic. Last April the Times reported that non-photographic child sex abuse images were being published on Facebook without age verification. Just a couple of weeks ago, moreover, another Times article showed that paedophiles were using YouTube as a “shop window” to showcase abused children. On that point, when the age verification regulator starts its work in May, will it be able to take action against this illegal content, or will the fact that Facebook and YouTube are not primarily for conveying pornography make it powerless to act? If the regulator is indeed powerless in this situation, what are the Government going to do to address this problem with social media and online platforms?
My Lords, I add my congratulations to the noble Baroness, Lady Kidron, on securing this excellent debate. As is always the case when one speaks late in such a rich debate, nearly everything that I would have said has already been said, and far more eloquently than I would have been able to do. Therefore, I will keep my comments brief.
I want to reiterate what many people have said already: that the current use of social media and internet platforms is an extraordinary new development and feature of our everyday life. The two-way nature of these platforms is unlike anything that society has ever witnessed before. We interact with information in ways that are sometimes carefully thought through, but too little time is spent thinking about the consequences. We leave behind us a breadcrumb trail of information that can be gathered, harnessed, analysed and used for commercial profit.
As the noble Baroness, Lady Harding, pointed out, these platforms can also bring huge benefits. They are not popular for no reason. An enormous amount of effort has gone into making them useful, and they often have huge beneficial consequences for the way we live our lives. That said, there is clearly a need for us as a society to think about our laws and about whether they are keeping pace with the current use of these platforms.
There is a unique aspect to the anonymity afforded by many of the providers, as some previous speakers have pointed out. Hiding under the cover of anonymity, it is possible to spread hatred, propaganda and abuse. However, it is also true to say that human nature dictates that we find people with real identities far more persuasive, and that has led to a new trend in fake personalities or fake accounts being created. Such accounts can be created very easily.
These platforms have become phenomenally popular and have achieved extremely high levels of penetration because of the ease of their use. By their very nature, it is easy and quick to set up accounts, and a low level of checks and balances is applied. That means that it is now possible for a relatively small number of people to create and control a large number of accounts. Co-ordinated action then leads to stories being published, then republished, “liked” and followed, and they, in turn, are picked up by algorithms that have been designed to make our news feeds and information flows more relevant to our interests.
Therefore, the extraordinary facilitating power of these platforms enables fake news and other propaganda to reach vast audiences. It is now possible for news to circumnavigate the globe almost instantly. In the time that it takes for something to be checked and challenged, lives are ruined and reputations can be irreparably damaged. These platforms have also learned how to attract our attention and generate the advertising revenues on which they thrive. The more outrageous and enticing the headlines, the more likely they are to receive attention.
There are obviously many other aspects that require our serious attention, and I just want to think about some solutions. There is clearly a need for us to acknowledge that self-regulation is insufficient. The noble Baroness, Lady Kidron, gave the example of children being expected to flag content that they find distressing, but that is clearly not sufficient. It is very important that we now expend public money and effort on providing a neutral and independent set of checks and balances on this entire industry. That is not without precedent. There are many people who are able to devote themselves to creating the same sorts of tools that have been used to combat advertising fraud or maximise advertising revenue, and they can be brought together in the public good and be funded from the public purse to create a check and a balance against the needs of the industry.
Education and keeping ourselves informed are responsibilities that we all must bear. It is possible to move away from the household brands and find alternatives with similarly interesting names, such as the Epic browser, the Comodo Dragon browser and the DuckDuckGo search engine, which enable you to have a very different experience of the web, one in which you are not leaving behind a hugely valuable data trail and can insulate yourself slightly from some of its worst aspects. The noble Lord, Lord Knight, talked about unionising users, which is an exciting idea that should be explored. With the great power we have as users of these platforms, we should be able to exert pressure back on them.
My Lords, I too thank my noble friend Lady Kidron for introducing this timely and important debate. However, I feel such a hypocrite. I live on many screens: I tweet and I follow, and I delight in Amazon—no more trips to the shops. As for Google, where would I be without it? The only major social media I do not use is Facebook—there I draw the line. Truth be told, I love the products that these companies provide, and yet I am so critical of these very same companies. Yesterday in your Lordships’ House I spoke about protecting public data assets from big tech. Today, I want to speak about big tech’s lack of corporate and social responsibility.
The fact is that these companies—Apple, Amazon, Google, Twitter and Facebook—have become the colossuses of our 21st-century world. They stand astride our economies and our social interchanges. Their corporate power is, quite frankly, scary, and the influence of their products on society can be devastating—just look at the recent US elections. It was Google that coined the phrase, “Do no evil”, a mantra that could just as well be applied to any of the other participants. They believe that they exist for the benefit of mankind. My view is different. I am truly worried by the power these companies wield, but before I turn to social media, I should like to address the related issue of their universal obsession in avoiding paying tax.
Why do I talk about tax avoidance in a social media debate? Because big tech companies are driven not to pay tax with the same fervour as they are driven not to take responsibility for the content that appears on their platforms. The same organisations that employ the brightest people in the world to design their products, enhance their systems and create their algorithms also employ the cleverest people on this planet to ensure that they pay little or no tax. Billions, perhaps even trillions, of dollars of untaxed corporate profits are squirrelled away in Luxembourg, the Cayman Islands and the like—the result of convoluted international structures set up with one purpose only. Maybe their mantra should be modified: “Do no evil, pay no taxes”.
This afternoon’s debate centres on the responsibilities of the social media companies, whether such companies are platforms or publishers, and how they should be regulated. For too long, YouTube, Twitter and Facebook have positioned themselves as platforms—conduits of data with no responsibility for their content. These days, few believe that. We continue to be shocked by the vile words and images that the world is able to access on these platforms. No newspaper, however extreme it may be, would ever dare publish the lies and images that we see on social media, but even now little is done to control these companies.
I am certain of one thing: if the social media companies and big tech really wanted to clean up their act, they could do it. The genius that created these amazing organisations and the accumulation of talent and resource that they now have at their hands is unparalleled in history. All of this could be harnessed to clean out their stables. They could become good corporate citizens. All they need is the will, or perhaps the legislative imperative, for without laws, it is clear that anything goes.
My Lords, I too thank the noble Baroness, Lady Kidron, for initiating this important debate.
I would like to draw the Minister’s attention to the increasingly important role of online content and social media and the Government’s forthcoming reforms to relationships and sex education. The Department for Education is currently conducting a call for evidence on this subject and new statutory guidance will focus on what should be taught in our classrooms to ensure that young people can navigate an increasingly complex world in relation to sex and healthy relationships.
As important as this undoubtedly is—I commend the Government for their efforts in this area—when new guidance is published by the department it should not ignore the role of online content and social media. In a recent opinion poll conducted by the Centre for Social Justice, almost 60% of young people said they are actively looking for relationship information and advice online. In the same survey, these young people told us that they are least likely to go to a teacher for information. New statutory guidance produced by the Government should bridge this gap.
The way young people receive information and go looking for it has changed, and RSE provision needs to reflect this. A modern approach to relationship education cannot be delivered exclusively within the classroom. At a time when a typical 12 to 15 year-old spends almost a day a week online and more than eight in 10 have access to a smartphone, the Government should consider how high-quality relationship information and advice could be delivered online as well as in the classroom. The previous Secretary of State for Education talked about the need for a modern approach to RSE. This must include a high-quality online presence. In the same survey, 42% of older teenagers thought that there was not enough good-quality advice and information online. The Government should step in with their reforms to guidance in this area.
One charity taking a lead in this area is the Family Stability Network, which has launched the Status campaign to help young people understand what it means to be in a healthy relationship. Status responds to a growing demand for better relationship information delivered online and through social media, and believes in helping young people think through their relationships and build longer-lasting, healthier relationships through engaging and informative content. Status is promoted to young people across social media and has reached over 500,000 young people in the past year alone. I recommend that the Government look to Status as an example of what can be achieved in this area.
Perhaps I may make a direct appeal to the Minister. When the Secretary of State launches new guidance on relationships and sex education in schools, the Department for Education should also announce a new dedicated innovation fund, recommended by the Centre for Social Justice, to encourage the development of kitemarked online information for young people and parents.
Relationship and sex education needs to go beyond the classroom if it is to make an impact on young people, and the department has an important role in making it happen.
My Lords, my noble friend Lady Kidron has introduced a debate that is not just timely but urgent. It is different from earlier debates and discussions we have had, which focused largely on harms done to individuals—although I know there has been a great deal of discussion of such harms today, and that is not something one can take lightly. The wanton or malicious uses of digital technologies, particularly social media, can spread content that harms other individuals. The list is very long—cyberbullying, fraud, grooming, trolling, extreme pornography and endless breaches of privacy and confidentiality.
However, today I am going to focus not on harms that individuals may do to other individuals using these technologies but on ways in which digital technologies may spread content that harms public culture, and thereby civic and civilised life—and, ultimately, democracy itself.
I have time to mention only a few examples. First, there is the harm to electoral process and public debate. In this country we regulate expenditure on advertising by political parties during elections quite closely, but advertising by others, and the dissemination of content that is not labelled as political advertisement—whether by individuals, corporations or foreign states—is unregulated. This used not to be a problem: such advertising was unlikely, it was costly and it could not effectively be provided from afar. But this has changed. Some noble Lords will remember the lurid and mendacious material that was “hosted”—in the pretty vocabulary that is used—online on websites run from Macedonian villages, which provided particularly provocative and damning content during Mr Trump’s election campaign. Digital content can be algorithmically distributed without any indication of provenance and without any means of complaint, redress or correction, over any distance and at very low cost compared with traditional advertising. The present situation makes a mockery of our tight regulation of party political expenditure on elections. The committee of the noble Lord, Lord Bew, might want to look at this one.
The second example concerns the debate about publishers and platforms. It is an important debate and we can all see why those who run online platforms are not always in a good position to exercise the responsibility of publishers. That is, as it were, their get-out card. However, what they are doing is hosting a large amount of anonymously posted content, resulting in irresponsibility at two levels: at the level of the platform and at the level of the individual who posts content. Is this acceptable? Well, people always invoke the argument of free speech, which we should take seriously. There may be a good case for protecting anonymous postings on matters of public interest under repressive regimes. That is a much-cited special case, but it is just that: a special case. There is no generic case for exempting from accountability those who post content anonymously or for protecting them if they damage, defame or discredit others, reveal personal information and the rest of it.
I think that democracy will fail if we find that when we talk about public affairs, what is going on is the equivalent of hiding behind hedges in order to throw stones more effectively. There is perhaps a case for holding online platforms responsible in the way that publishers are responsible, if not for all the content they carry then at least for any content posted without a verifiable indication of its source. That verifiable indication would mean that the individual carried liability, which would be better than the present situation.
The third example is that of monopoly providers, about which others have spoken. This is a serious issue because these are enormous companies and, given that they are digital intermediaries, they can shift jurisdictions very quickly. This has wide and deep effects on public culture. We need to think very hard about anti-monopoly provisions in this area.
My Lords, the manipulation of social media and control over differing value sets present regulatory and ethical challenges in today’s world. Manipulation can undermine political and social life, shaping Governments and governance, colouring decision-making and enabling economic espionage. Thirty countries are said to use online tactics to manipulate outcomes, yet Governments currently have limited or no control over this environment. Identifying perpetrators with certainty is difficult. So what is to be done, and by whom, moving forward?
The short answer is that the social media platforms could and should step up to the plate and publish their own analyses. Distinguishing one threat group from another is possible when sufficient information, analytical know-how and technology tools combine. Cyber intruders leave digital footprints with links that enable computer forensic analysts to separate one intrusion from another. Major platforms, most particularly Twitter and Facebook, retain the vital data to pinpoint state-sponsored accounts operating on their platforms—but they are not willing to share it. They say that their own systems work internally to find and shut down bot and misinformation accounts. But, whenever they delete an account or when the account holder deletes it, the information is lost, with trolls simply making new accounts and reviving the process.
Foreign influence does necessitate formulating a plan to counter interference. Should the Government reject calls for censorship and regulation, and instead trigger a process to enshrine protections and penalties in domestic legislation, and so rein in, control and protect through the rule of law?
What realistically could be done? We could devise support programmes of fact checking, verification and digital forensic initiatives capable of exposing falsehoods and false claims of authority that underpin fake and propaganda pieces, and ensure platforms crack down on automated amplification networks that impersonate humans—botnets. Social media networks could develop and administer algorithms for identifying and removing fake news by marshalling the same engines that spread fake news in the first place. They should identify repeat disinformation offenders and have them demoted, if not taken offline.
Government should also invest in media literacy and education programmes. Emotional targeting is the central tactic of disinformation. People have to be taught how to recognise it. When it is used as a direct tool of the state, we must expose it, not ban or censor it. We must work with the social platforms and civil society groups, not against them, to close such loopholes as anonymous accounts and the use of hyperpartisan rhetoric. Platforms should make verification necessary and easier. Traditional media must also responsibly verify the social media accounts they cite.
HMG can lead internationally by devising and promoting a new global treaty to nail this issue, and here at home by creating an independent commissioner with oversight, accountable to Parliament. Self-regulation is to be supported but scrutinised. There will always be loopholes, but signals from the major platforms are encouraging and consequently should be applauded. Co-operation, not complacency, must win the day.
My Lords, I am grateful to the noble Baroness for introducing this debate. I do not think any of us can claim that this is the most digitally aware workplace in the country. Indeed, when talking about Twitter with a colleague here the other day, he asked me how many followers he had. I had to explain that as he was not signed up and did not have a Twitter account, he did not actually have any followers. I do not think he is unrepresentative.
I cannot pretend to be a digital whizz myself, but I am on Twitter. Somehow, without knowing how, I have managed to set up my Twitter account to feed automatically to my Facebook page, which I am rather pleased with. I have just over 5,000 followers, which pales into insignificance compared to, say, the noble Lord, Lord Sugar, who, with nearly 5.5 million followers, understands the power of communicating directly with and influencing a very large audience. With 500 million tweets posted every day and some 330 million active users on Twitter, the power of online platforms cannot be ignored.
I wish to focus my remarks on how this revolution is affecting public life. Social media has made communication with those of us in public life much easier. More than 70% of UK adults own a smartphone, which can be used from any location to send messages directly to the social media accounts of politicians and candidates. My interest, as noble Lords may be aware, is as chair of Women2Win, which encourages and supports female Conservative candidates to stand for election. A recent Fawcett Society survey of women in public life found that most women failed to report abuse because they did not think the platforms would act. This is wrong: the platforms should take tough action against abusers.
I very much welcome the Committee on Standards in Public Life’s recent inquiry, which showed, among other things, that Conservative candidates, especially women, were more likely to be the subject of intimidatory behaviour than candidates representing other parties. This is worrying. It is hard enough to get women to stand for public office, and all barriers need to be addressed. If they are not we will be left with a political culture that does not reflect the society it should represent, with serious implications for our democracy.
Let me give your Lordships a real example—one of many. During the election campaign in June, the Ealing Central and Acton Conservative candidate was met daily outside her home by a large group of Momentum and Labour activists yelling at her, and I quote—and please, my Lords, forgive the unparliamentary language and block your ears if you are sensitive or easily offended—“Fucking Tory cunt”. This young woman has a young child. How can this be acceptable? How does this not deter other mothers from stepping up? Her activists and volunteers were routinely spat at. They told an Asian activist that she deserved to have her throat slit and to be in the ground for being a Conservative—and much, much more, especially on social media.
Standing for election and public office for whatever political party should be recognised and celebrated as a noble, honourable and responsible action to take. This abusive behaviour is fuelled by the anonymity which social media platforms provide. This is just one example of many where, during an attempt to take part in the democratic process, a candidate was subject to abuse, intimidation, libel and slander. Civil, criminal and electoral laws were broken, yet no action was taken. Online platforms have a responsibility to play their part in preventing this in future.
My Lords, I, too, thank my noble friend Lady Kidron for this debate. We joined the House together and I remember clearly her saying to me, “Oh, I really do not know anything about technology”. That is clearly untrue and I learn from my noble friend all the time. If the noble Lord, Lord Mitchell, is a hypocrite, I am afraid that I am Judas, as I must confess early on that I am a board member of Twitter—I shall come back to that in a second.
I was lucky enough to give the 2015 Dimbleby Lecture, in which I presented the case for believing that the Silicon Valley giants would take a tumble over the next few years, but even I could not imagine how quickly they would fall. My own small think-tank charity, Doteveryone, did some research, released this year, showing that 63% of the UK’s adult population does not trust technology. Only one in five people believe that technology companies are doing something valuable with their data. More than 90% of people want to know what is being done with their data, and only 30% can find out. These are staggering statistics, and it is important to put today’s debate in the context of that failing wider consumer and civic trust in technology, because it is corrosive. As we have heard it most eloquently said by many people around the House, technology is not going away.
Perhaps I may return to Twitter. I joined the board because I am an avid user—not quite with the 5 million followers of the noble Lord, Lord Sugar; my own small number is a fraction of that—and because, when I became UK digital champion in 2009, it immediately gave me a route to some of the local community groups working on aspects of digital inclusion that I knew nothing about. It enabled me also to tap into the biggest brains in the sector and build up my own small following of people who were interested in what was happening. I have learnt three things from being on the board that I would like to share with your Lordships today as they are very relevant to the debate.
First—and this perhaps is the most important—nuance, complexity and specificity of argument, policy decision and change are incredibly important. Twitter is not Facebook; Facebook is not Amazon; Amazon is not Google. Yes, they share many characteristics. On the point made by the noble Lord, Lord Mitchell, I wish that Twitter had even made a profit. I am sure that many of your Lordships in this Chamber would think that it had, but it has not. We have enormous reach, with 350 million users; we have fewer than 3,000 members of staff and, as yet, no profits. Google, as is well documented, has $70 billion on its balance sheet. As noble Lords may have seen from the front page of the Guardian today, Jeff Bezos is now the richest man in the world, with $106 billion of wealth personally to his name, which could pay off the UK national debt twice. It is incredibly important, if we are to make good decisions in this Chamber and beyond as users and citizens, that we are specific in our discussions.
Secondly, I have learnt more than anything that diversity of thought and view is vital. I am surprised and happy that the noble Baroness, Lady Jenkin, remarked on parliamentary candidates’ roles on social media. We must fight for more equality of representation in all those companies at the most senior levels. I was the second woman to join the Twitter board. There are only two women on the board of Facebook; one is Sheryl Sandberg; there is only one woman on the board of Snap. We will never get to a point where some of the counter-winds that we face are recognised and some of the incredibly unpleasant behaviours nailed in engineering terms if we do not fight for more women to be at both board level and engineering level. What action can the UK take to build the role of women in the technology sector in this country? It is vital.
My final point concerns something we have under-egged in the debate today: I do not really believe that many countries understand the internet but I very much believe, as I said in a recent debate here, that Russia, China and North Korea do. We ignore that at our peril. They are the experts in social media. China has built a parallel internet, as we are all aware. They are now monitoring their own citizens, building huge profiles of them and will reward them in the future with services and different mechanisms to keep them incentivised to behave well. Yes, our UK issues are very important, but we are a minnow. The entire European tech sector is just 7% of that of the US. We have to keep focused on our role globally and the big geopolitical headwinds we face.
My Lords, the summing up of a debate such as this is always difficult and today it is almost impossible. I will not mention all the contributions—as I say, that would be impossible. I will mention three. I am delighted to be following the noble Baroness, Lady Lane-Fox. As she knows, I am one of her groupies in that I have looked to her for advice on this area since I was a Minister and she was part of an advisory group, which I confess I referred to in my private office as “Geeks Anonymous”. I am also thankful to the noble Baroness, Lady Kidron. I said after her amendment that hers was a parliamentary triumph and a game-changer. I believe that the Kidron amendment will be referred to time and again in the years to come as having changed the weather in how we approach this. Finally, the noble Lord, Lord Puttnam, has been my mentor and friend on these issues for 20 years and I am grateful that he has intervened again today. As for the rest of you, all I can possibly do is amend a saying beloved of our American friends: there has not been so much wisdom concentrated in one place since Thomas Jefferson dined alone.
My own mentor, Jim Callaghan, used to like to say, “A lie can be halfway around the world before truth has got its boots on”. Jim used to say that in the 1970s: now, of course, it happens in nanoseconds, at the speed light carries information. How our societies come to terms with what has been termed the fourth industrial revolution, the data revolution or whatever, will be one of the great challenges. Matthew Parris, who entered the Commons in the same 1979 intake as I did, wrote in the Times on 30 December:
“The internet is a jungle that can’t be tamed. It would be impossible to censor social media so we might as well embrace fake news and learn to ignore the insults”.
I admire Matthew Parris, both in his political career and as a journalist, but it is a thought with which I profoundly disagree. It is the task of this generation to bring the new technologies within the rule of law and of democratic accountability. Of course, I agree with the noble Baroness, Lady Harding, that we need to get the balance right and to make a proper judgment about benefits and real harm, but I also agree with her that saying no politely is not enough of a response for these social media giants.
I think that in many ways we are in the same position as politicians who had to face the massive changes of the industrial revolution and, in the United States, the massive growth of corporations. The political systems showed the ability to tackle the big trusts and monopolies, and questions of health and safety, hours of work and all the rest. We must not preach a feeling of doom about this: these companies are not beyond our control, but when I say control, I mean that light touch. When I was the Minister working on the general directive which is at the core of the Bill that was debated in the House yesterday, the British position was constantly to have light-touch regulation. We were mostly opposed by countries which had only recently experienced a Stasi, or the power of an intrusive state, so I understand the balances and the discussions.
In many ways, some of the agonising in this debate today is always there in a liberal democracy—small “l”, small “d”. In liberal democracies, we agonise about what the limits of free speech are, and if we put limits on it, we worry about why we do so. In that respect, as I hinted in the debate yesterday, I am closer to the noble Lord, Lord Black, than might be imagined. I really am worried that these big companies can, as it were, asset-strip the communications industry in a way that undermines the ecology of all communication. I cannot remember which of the White Papers it was from some time during the 1980s or 1990s that talked about diversity, quality and choice as the aim of policy, as far as communications in its widest sense goes. I still believe that is important and that to have that diversity, quality and choice, we must make sure that our print media are not dramatically undermined. I took to heart what the news media associations said on how these new technology companies are undermining and weakening them.
I have also had the briefing, as most of those taking part probably have, from ITV, Channel 4 and Sky about the impact on them. Of particular interest, and an old concern of mine, is the BBC. If our communications ecology is under threat from these companies, it is more important than ever that we continue to support the BBC and the other public service broadcasters in the job that they do. We need to be careful that they are not undermined by what these tech companies are doing. This applies not only to the provision of news but to the part these broadcasters play in underpinning our cultural values, in the programmes they commission and the work they do. Ofcom has to take on its new responsibilities by not only regulating or overseeing the BBC but by defending it against unfair attacks. In this new age, a BBC dedicated to inform, educate and entertain is more needed than ever.
I agree entirely with the point made by the noble Lord, Lord Puttnam, about education. Again, it is good that we have had this overlap between the Data Protection Bill, which we debated yesterday, and this debate. We are talking about getting ourselves ready for this transformation. I also agree with the noble Lord, Lord Inglewood, that it is as big a change as the invention of the printing press. It is a complete challenge to almost every sector of our society and if our democratic institutions are to be able to survive the assault that this era of rapid change brings to us, we will have to be ready. I will cite again that when the Education Act 1870 was passed, it was said that they had to educate their masters. Well, now we have to educate our population—not just our children but us all.
We have also got to educate ourselves. I was invited to a round table as an expert on the digital economy. I said that I am not an expert but a politician trying to learn about what this involves for our society. I strongly support the suggestion made by the noble Baroness, Lady Kidron, of an ad hoc committee of this House. I know that is not a ministerial responsibility. This debate is not an end in itself. It is part of a process of getting ourselves prepared and ready for some of the challenges that new technologies are going to bring to Parliament and to our society.
My Lords, the noble Baroness, Lady Jenkin, remarked that this is not the most computer-savvy institution on the face of the globe, and no one is a better illustration of the truth of her statement than me. However, with the noble Baronesses, Lady Kidron, Lady Lane-Fox and Lady Harding, my noble friend Lord Knight and others around me, I feel I can learn much on this important subject and, to pick up the noble Lord, Lord McNally, at the end of his remarks, ask myself how we proceed with this complicated material and the urgent need to find ways of dealing with what is clearly a priority for our society.
It is clear that online companies are coming under increasing pressure to move beyond their claim to be merely platforms for a broad and broadening range of interest groups and individuals, a catalyst for the free flow of knowledge and ideas and a factor in the democratisation of information. It is true enough that in some measure all these claims can be justified and attested. A great deal of information, an endless amount of social activity and a flood of ideas have indeed been greatly enhanced by the emergence and development of social media. We have cause to wonder at these developments. As I sit with my grandson, I see him handling things that make me wonder whether I would not have preferred to have died before he was born since the learning curve is sometimes so steep.
As many noble Lords have said, we have all become increasingly aware of the unquantifiable amount of dark, intrusive, prurient and dangerous material that flows along with the helpful and hopeful stuff. Through conversations in our social lives, we can detect rising unease among the public that is reaching a point, if it has not already reached it, of wanting to hold to account the great cartel of corporations who pretty much hold us globally in thrall. I have been delighted to hear of the work Google is now doing on tackling hard issues, such as fake news, supporting high-quality journalism, fighting extremist and controversial content, promoting child safety and educational campaigns and protecting intellectual property. Long may it continue, but I hope we will not be seduced by these siren sounds. We must go on giving intense scrutiny to this, and that note of urgency has been sounded again and again from all parts of your Lordships’ House.
There are key questions that these corporations, and we, must face. The noble Lord, Lord Mitchell, referred to them being taxed appropriately, and we must not lose sight of that, but their accountability for the material they handle and enable to pass into the public realm also needs to be faced. I believe they should be treated as publishers, perhaps bearing in mind the distinction that can be drawn between anonymous and unattributable material and that which can be attributed to authors and sources. This is a key moment in the history of the fourth industrial revolution. Just as the grim factories and satanic mills of 19th century England eventually and sometimes painstakingly had to come to terms with their responsibilities for the health and well-being of their workers, as well as for their impact on the green and pleasant land around them, so now we must press for a similar development in the realm of social media.
It is not appropriate simply to denounce or demonise the digital media. There are much more epistemological and historical things at stake, which have made this an opportune moment for social media, like carrion coming upon a rotting carcass, to make the most of it. Indeed, fake news is invading our intellectual landscape like Japanese knotweed, yet it would be wrong to identify social media as being entirely to blame for what one writer describes as,
“the crash in the value of truth … comparable to the collapse of a currency or a stock”.
That assertion was made by the respected journalist Matthew d’Ancona in his recent book Post-Truth, which ended up in my Christmas stocking this year. To take a word from the remarks of my noble friend Lord Puttnam, trustworthiness—trust and truth—has collapsed but this has been going on for a long time, the pace accelerating since the financial crash of 2008.
“There are no facts, only interpretations”,
said Friedrich Nietzsche. The Manic Street Preachers, a group from my native land, released an album that said, “This Is My Truth Tell Me Yours”. There is an individualisation of statements of truth, the disappearance of a meta-narrative and the evolution of a world in which we make our own truths and set our own standards. When I was studying theology 50 years ago, there was situation ethics—the ability to reach ethical conclusions according to your own lights and experience, not by subscribing to something that was generally approved. It is against those general factors that we must look at what is happening in the realm of social media and digital materials that we are talking about today.
There are worries. We need only flag up the dark area that emerges from the freedoms that we are now given. Terrorists, paedophiles and money launderers have all profited from them. The current television series “McMafia” is a perfect illustration of how wrong things can go, in the words again of Matthew d’Ancona, like,
“a runaway train, crashing through privacy, democratic norms and financial regulation”.
If there is any truth in the argument that what we are looking at specifically in the realm of social media needs to be mapped against a more general philosophical and moral situation, one that is historical and has been developing for a long time, then we need to look very carefully—I like the word “nuance”, which the noble Baroness, Lady Lane-Fox, used—and we must all do something about it. It is not just for the Government, of course; everybody with an interest has a job to do in cleaning this whole situation up.
Two very current issues add urgency to the consideration of these matters. It was good to hear the noble Lord, Lord Bew, and the right reverend Prelate the Bishop of Gloucester refer to the Committee on Standards in Public Life and its recent report. It is vital, as the committee put it, to convene a constructive and solutions-focused dialogue between the social media companies and the political parties. Other noble Lords have referred to this too. It seems that Twitter and Facebook have both confirmed their readiness to participate in such a dialogue; others have yet to come on board. We must hope that they will join the debate soon, and that there will be a fruitful and beneficial outcome to their discussions.
I finish with the issue of press freedom, which was debated with such passion yesterday. I was just a bystander listening to the entire debate, but there is something immensely sad about hearing two opposing cases being put with equal conviction—two admirable cases but each made, I kept feeling, from within locked rooms.
So much has happened in the world of communication since the publication of the Leveson report in November 2012, nowhere more so than in the area we are discussing. Then, the focus of the inquiry was mainly on newspapers and related outlets. Since then, the expansion of digital media has been truly exponential. Alongside these developments, we must hope to create an ethical framework that allows us to distinguish relative positions of rightness and wrongness and appropriate behaviour.
I wonder whether the Government would seriously consider creating a new category of services, making the digital companies responsible for the content that passes their way, bringing into being a code of practice for social media companies and underpinning it in statute. I wonder whether there could not be brought into being an independent regulator with the power to oversee the system, investigate breaches and ultimately sanction non-compliant platforms, and whether there should not be a statutorily backed levy on social media companies to fund internet safety education. None of those recommendations is mine; they were made by Sky and have already been submitted for inclusion in the Government’s Green Paper on internet safety. Let us hope we are moving in the right direction.
We are very grateful to the noble Baroness, Lady Kidron, for starting this very rich debate. It is certainly not finished today.
My Lords, I am very grateful to all noble Lords for their interesting and succinct contributions—I know how difficult that is on a subject such as this. I very much support the noble Lord, Lord McNally, in his view that this is part of a process and that we will not provide all the answers tonight. I hope that I will answer some noble Lords’ questions, but there is an awful lot to get through. Of course I also thank the noble Baroness, Lady Kidron, for convening this debate.
There is a lot to cover, but I say at the outset that the Government get the message which I think the House in aggregate is giving today, which is that social media companies and the way they work have developed rapidly and that there are issues that need to be considered. I hope that I can show that we are taking that seriously.
For example, I think we all agree that the internet offers a huge range of opportunities but, as we have heard, there are legitimate concerns about illegal and harmful content online. A number of noble Lords have expressed specific concerns about the role that major social media platforms play. Because we are so acutely aware that the internet can have both a positive and a negative effect on users, particularly children, the Government have developed a clear ambition, as stated in our manifesto commitment, to make the UK the safest place in the world to be online. We aim to realise that ambition via policies developed through our new digital charter.
If I may initially confine myself to the noble Baroness’s Motion—which I know is not always the practice in this House—I must make the point that online platforms such as social media companies, auction sites and cloud service providers are, as has been alluded to, currently defined as information society services, as set out in European Union law. While we are still a member of the EU, the UK is subject to the e-commerce directive, but the directive was drafted in 2000, when the internet was in its infancy. The intention behind it was laudable: to create a regulatory environment in which cross-EU online commerce could flourish and prevent member states creating barriers to the growth of the digital single market. But since the turn of the century, digital technology has developed faster than society has adapted to that change, and citizens now have legitimate concerns about rising online threats. The noble Lord, Lord Bew, explained the dilemma that this creates.
Around the globe, it is now increasingly acknowledged that there are problems with online behaviour and content and that they must be addressed. The EU Commission recently published guidelines on how online platforms should increase the proactive prevention, detection and removal of illegal online content, and it is currently considering whether further action should be taken.
The noble Baroness, Lady Kidron, asked about legal liability structures to which I alluded. I thought it would be helpful to show what the new Secretary of State said in his evidence to the Select Committee on polling and digital media—the noble Lord, Lord McNally, might be interested in the philosophical nature of this. He said:
“The approach that we take as a whole to the internet and internet companies is encompassed in what we call the digital charter. Essentially, this is about changing the attitude towards what happens online from a libertarian view that the more people connect in the world, the better, and that Governments should have no view, which was probably the founding political philosophy of the internet, to a liberal values view whereby you support and promote the freedom that the internet brings while ensuring that that freedom does not trample on the freedom of others. That involves mitigating harms”.
We agree, and as long as the UK remains a member of the EU, and bound by its rules, we will work closely with the Commission, and other member states, to secure further progress in this area. Of course, consideration of online liability is fraught with complexities, not least because we will be leaving the EU. Similarly, an ill-considered approach might also produce technical problems for online service providers. If they were to become fully liable for all third-party content, this could be fundamentally prohibitive to many service models, including those operated by cloud storage providers, video-sharing sites and others. Balancing these various interests is a delicate matter, but essential if we are to meet safeguarding concerns for users while still supporting the internet as a useful vehicle for exchanging ideas and promoting the digital economy.
These points are not intended in any way to downgrade the importance of tackling online harms, but rather to outline the need for a well-developed and, if possible, consensual approach. The digital charter is our primary response to the more fundamental questions of ensuring that new technologies work for the benefit of everyone. The noble Lord, Lord Knight, talked about new policy thinking. While I talk about that, I remind noble Lords that we intend to set up a data ethics and innovation body, and we have allocated £9 million in this budget to do that. It could consider things such as the verification ideas that the noble Baroness, Lady O’Neill, mentioned, and the suggestion of the noble Baroness, Lady Eaton, of an innovation fund, among many other things. We intend to develop the policies and actions to make the UK the safest place to be online, and to drive innovation and growth across the economy.
As the noble Baroness, Lady Lane-Fox, mentioned, that includes women, who are a valuable and essential resource. I am pleased to say that the Government are supporting the recently launched Tech Talent Charter, to which over 125 tech companies have already signed a pledge to take concrete measures to improve the gender diversity of their workforces.
What we are trying to achieve cannot be achieved by government alone. So we will work collaboratively with citizens, businesses, charities and others to build both our understanding of the challenges, and a consensus around the solutions. As I have mentioned, the challenges we face online are global. The international element was mentioned by the noble Baronesses, Lady Kidron, Lady Benjamin, and my noble friend Lord Inglewood, among others. It is at this global scale that we should be looking to gain consensus on our approach. We have already begun to hold international discussions on the key issues under the charter, including at the recent G20 digital taskforce in Hamburg. Going forward, we will look to expand this work, including bilaterally with like-minded countries such as France, and through multilateral organisations, including the OECD and the D5.
The very first element of the digital charter is our work on online safety, a reflection of how seriously the Government take this issue. In October, we published the Internet Safety Strategy Green Paper, an important next step in meeting our relevant manifesto commitments. The strategy set out our ambition for everyone to play a role in tackling online harms. For example, we are working closely with the Department for Education to ensure that online safety, which the noble Baroness, Lady Grey-Thompson, mentioned, is part of new compulsory relationships and sex education curriculums, and that parents have the support they need to keep their children safe. We will certainly pass on my noble friend Lady Eaton’s suggestions about innovation to the department.
In answer to the views of the noble Lord, Lord Puttnam, on understanding how these large sites work and what they do, we acknowledge his point. The Department for Education issued a call for evidence late last year to help shape the new content and guidance, and we expect the new curriculums to cover digital literacy and critical thinking skills. The noble Baroness, Lady Lane-Fox, had a debate recently about digital understanding, which was extremely useful and interesting. Alongside the publication of the strategy, a public consultation was launched, which asked for views on a range of new safety initiatives—this is the scope that the noble Baroness, Lady Kidron, asked about—that included a social media code of practice, a social media levy, and transparency reporting. The consultation closed on 7 December, with a good number of responses from a range of contributors.
The noble Lord, Lord Bew, mentioned the report of his Committee on Standards in Public Life, Intimidation in Public Life. I think that the right reverend Prelate the Bishop of Gloucester also mentioned that. We will address its recommendations in the government response, which is due to be published shortly.
As set out in the strategy, we are working with the main social media platforms on a voluntary basis because we believe that that secures faster results. However, the previous Secretary of State was crystal clear, and the new Secretary of State agrees, that we will not hesitate to bring forward legislation, if necessary. I hope that that commitment reassures my noble friends Lady Harding and Lady Fall. The age verification protections for online pornography show that we are willing to tackle online harms through legislative means. The internet safety strategy is not the only vehicle through which we will protect children online. I am very pleased to be responding to the noble Baroness, Lady Kidron, so soon after we worked together closely on securing improvements to the Data Protection Bill. I commend her persistence and firmness, but also her good humour. We have never once fallen out—yet!
I am sure that the House does not need reminding that the Government were pleased to support an amendment to that Bill to address the concerns of many noble Lords. We supported a statutory code of practice for age-appropriate design for all information society services. We look forward to working with the Information Commissioner’s Office to drive up the levels of protection afforded to children online.
Many noble Lords mentioned fake news. The Government are committed in their manifesto to protect the reliability and objectivity of information as an essential component of democracy. Work is now under way, also under the digital charter, to ensure that we have a news environment where accurate content can prevail. As my noble friend Lord Black said, it is the UK’s robust, free, wide, vibrant and varied media landscape that remains our key defence against disinformation.
I shall go through as quickly as I can some of the points that noble Lords have raised. The noble Baroness, Lady O’Neill, mentioned competition policy. We have a world-leading competition regime, and we will continue to keep it under review. The Competition and Markets Authority recently announced a new technology team to strengthen its ability to deal with competition issues surrounding algorithms, artificial intelligence and big data. We are also setting up a new Centre for Data Ethics and Innovation, as I mentioned, which will be well placed to support the CMA in its work.
The noble Viscount, Lord Colville, and the noble Baroness, Lady Lane-Fox, talked about trust in the media. We absolutely agree with the trusted role of the traditional media sectors in the UK, but I do not believe that the trust has been eroded quite as much as some may fear by the newer forms of content. A recent Radiocentre survey in 2017 on levels of trust in media sources among UK citizens found that 77% of respondents trusted radio news, 74%—just under three-quarters—trusted TV news and only 15% trusted news on social media. The public are not complete fools.
The right reverend Prelate the Bishop of Gloucester, who is my bishop, I might add—I do not mean “my” bishop; I must push on—asked why we are not establishing an independent digital commissioner. The Digital Minister, supporting the Culture Secretary, who is personally invested in raising the level of online safety, plays that convening role on this issue across government.
The noble Viscount, Lord Colville, talked about the filter bubble effect and I thank him for his interesting views on this. The Government consider the effect of news, advertising and other content being tailored algorithmically to personal preferences to be an issue. The work on the digital charter will consider this and what response is most appropriate.
I have an answer from the Box on the gambling questions that the noble Baroness, Lady Howe, asked. It says, “We will write”. It also says, “Wrong officials in the Box”. But, more seriously, I have a reply in train to the noble Baroness on this subject, following the recent debate that we had on gambling. That is ongoing and I will write to her.
The right reverend Prelate the Bishop of Gloucester talked about Germany’s new law, under which social media companies are fined for not removing hate speech on their services quickly enough. We are aware of that and, through the digital charter, we will look right across the range of potential solutions for tackling that issue. We will look at steps that other countries, including Germany, are taking to inform this work.
The noble Baroness, Lady Worthington, and the noble Lord, Lord Vaux, asked what we are doing about online anonymity, which I think is an interesting point. We asked questions about online anonymity in the internet safety strategy consultation and we are analysing those responses. We will formally respond to them soon.
Online advertising was raised by my noble friend Lord Black and the noble Lord, Lord Vaux. They were right to raise the role of online advertising in many of the issues discussed. We have a good advertising regulatory system but we recognise that there have been rapid developments in the marketplace. We are working alongside the Advertising Standards Authority to monitor developments and respond appropriately. This is a key part of much of our work under the digital charter, including ensuring that there are sustainable business models for high-quality online news media, protecting people’s personal data and ensuring that value created online is rewarded appropriately.
The noble Baronesses, Lady Benjamin and Lady Howe, asked about the BBFC and age verification in social media. The age verification regulator will not duplicate the Internet Watch Foundation’s remit. If, in the course of investigations, the age verification regulator identifies child abuse images hosted in the UK, it will report these to the IWF. We recognise the concerns about the availability of pornographic material on some social media platforms but, as we discussed during the passage of the Bill, it is our intention for age verification to apply to the pornography industry. Within the regulator’s powers will be the ability to notify ancillary service providers, including the social media platforms, if, for example, a person is using a social media platform to market their non-compliant website.
I want to end by repeating that the Government are very much concerned about the impact that online harms are having, particularly on young people and children. That is why we are launching a range of initiatives to keep people safe online—and we agree with the noble Baroness, Lady Grey-Thompson, that social media sites should take responsibility. Social media platforms should be aware that, if we do not get results, we will not be afraid to go further.
I would like to be the first member of the data users union proposed by the noble Lord, Lord Knight. I hope that other noble Lords would like to join me. It is an excellent idea. In my excitement at starting the debate I forgot to declare my interests as set out in the register, including that as founder of 5Rights.
I think it is fair to say that you could characterise our feeling towards online services and social media as a combination of loving someone who is behaving badly, the frustration with an 18 year-old who does not quite know that they have grown up and is supposed to behave in a different way, and a palpable fury at corporate indifference on certain important subjects. However, it is too easy to just look at it that way. All those things are true and I share all those views, and I thank all noble Lords who have spoken in this fantastically interesting and progressive debate. However, what I heard most was our failure to articulate an ethical standard by which we want the industry to behave so that we can then meet it halfway. That was what came out of today’s excellent debate—questions of democracy, accountability, transparency, monopoly, tax regimes, codes of conduct and global consensus on governance. These are matters for society. If we are to have the society that lives by the values we want, we have to show leadership. I say to the Minister that I think the Government are showing leadership, which I welcome. I again thank all noble Lords for their contributions. This has been, by all standards, a wonderful debate.