I beg to move,
That this House has considered cyber troop activity in the UK.
It is a pleasure to serve under your chairmanship, Sir Charles. I secured this debate because I feel that we cannot go into another round of elections in May with our heads in the sand about a very real threat to our democracy: industrialised disinformation by state and political actors. Across political divides we must stand against the forces that seek to smear, manipulate, speak untruths and undermine the legitimacy of Governments or political opponents, through underhand and under-regulated techniques.
Politics, by its nature, will always host opposing and differing views, and that is absolutely right, as is the opportunity to debate the points around these views, but it is incumbent on all of us to ensure that the public can have confidence in the information that they see presented from politicians and those who report on political events. The old adage that a lie can get halfway around the world before the truth gets its boots on certainly applies here, but at a whole new industrialised level, with mass distribution only ever a mouse click away.
Social media has been both a blessing and a curse. It is, in theory, a great leveller, providing an open platform for the discussion of ideas and helping the disadvantaged to organise groups to get their voices heard. It opens up publishing to citizen journalists, speaking without gatekeepers. However, in many ways, instead of widening the debate, it has become increasingly polarised and dominated by echo chambers, with information provision ruled by mysterious algorithms. The lack of editorial content control has created a nightmare for fact checking and fairness, and increasing numbers of nefarious actors have learned how to manipulate the system, fuel conspiracy theories and sow division. The waters have become murky and it is a pool in which many people no longer want to swim.
It cannot be dismissed simply as modern-day political spin. The new technologies create far more poisonous possibilities for the most Machiavellian practitioners of the dark arts, and there is plenty of evidence that they are taking advantage of these new superpowers. Those who want to see standards and integrity in public life maintained cannot simply stand by and ignore it.
Millions are being spent on orchestrated disinformation in what the Electoral Reform Society described as the unregulated “wild west” of online political campaigning. Organised cyber-troop operations use an increasingly sophisticated armoury to alter the nature and course of legitimate political debate, to smear and discredit opponents, to interfere in foreign affairs and generally to create distrust in the very processes on which democracy relies. Facts get confused, opposing points of view are tainted and people are turned off by an onslaught of hate, misleading propaganda and deliberately divisive content.
Techniques used by these cyber-troops include armies of trolls or political bots amplifying particular opinions or hate speech, harassment, doxxing, smearing, doctoring images and videos, mass reporting of content and illegally harvesting data to micro-target with misleading information. They do it because it works.
Fergus Bell, the co-founder of London-based media consultancy Fathm, has worked on many elections and believes that false information shared online has been “very successful” at swaying voters. It does not have to be direct in its influence but, as he says,
“if you cause division between people, or if you can change someone’s mind on one tiny thing that might make them vote differently, you can push an election”.
The cyber-troops have precise, data-driven strategies to home in on the soft spots, and they know exactly where those are.
People who are targeted by these tactics may be disenfranchised by the processes, become disillusioned with everyone involved in politics and no longer bother to participate in democracy. In some cases, this appears to be the purpose of cyber-troop activities, as Channel 4 reported in the US elections, where they found evidence of micro-targeting by the Trump campaign to deter 3.5 million black Americans from voting at all. That type of voter suppression should alarm us all.
The rapid rise of disinformation industries is evidenced in the Oxford Internet Institute’s report, “Industrialized Disinformation: 2020 Global Inventory of Organized Social Media Manipulation”. It is quite a wake-up call for those who think that these things could not happen or do not happen here. The report found that 81 countries are now using social media to spread computational propaganda and disinformation about politics, including the UK, which is a jump from 70 countries in the previous year.
The report found evidence of Chinese, Russian and Iranian-backed disinformation campaigns about covid-19 to amplify anti-democratic narratives and undermine trust in health officials. Microsoft has also warned that hackers operating out of Russia, China and Iran were targeting the staff and associates of both Donald Trump and Joe Biden ahead of the US election last year. In Argentina, a “deepfake” video was used to make the Minister of Security appear drunk.
As for China, a 2017 Harvard paper estimated that the Chinese Government employ 2 million people to write 448 million social media posts a year. The primary purpose of this activity is to keep online discussions away from sensitive political topics. Closer to home, the long-delayed Russia report from the Intelligence and Security Committee confirmed that there was “credible open source commentary” suggesting that Russia tried to influence the Scottish independence referendum and subsequent elections. Yet astonishingly it seems that the Government have not yet sought to find evidence of interference in the EU referendum and instead took an ostrich-like approach to defending our democratic process. At the very least, I would hope that the Government could be looking to implement the recommendations of the ISC report.
It is not just foreign interference that is at stake here; the UK has to get its own house in order. There are questions about data-driven profiling and Facebook advertising by political actors in the UK. In the 2019 general election, 90% of the Conservative party’s Facebook advertisements in early December were labelled as misleading by Full Fact. The real danger of this kind of misleading content is that cyber-troop tactics can then be used to amplify it to the extent that, by the time it is rebutted, it has already reached thousands of feeds. The Conservatives even tried to rebrand their Twitter output during a debate as coming from “factcheckUK”, changing the logo to hide its political origins and pushing pro-Conservative material in a way that deliberately confused it with independent fact-checking sites.
Another question is why Topham Guerin, one of the communications companies behind the 2019 campaign, was awarded a £3 million covid-19 contract by the Government. It is yet more evidence of the need for my Ministerial Interests (Emergency Powers) Bill, which aims to hold the Government to account, to be supported in all quarters of the House—but that matter is for another day.
Although it is not always clear who is behind these actions, there is always clear evidence of bots being used to swell numbers artificially and drive political positions. A study by the Institute for Strategic Dialogue identified that almost all of the 10 most active accounts on Twitter discussing the Brexit party appeared to be automated bots, while prior to the 2019 general election a report found that a third of the Prime Minister’s own Twitter followers were bots.
Tackling this issue is not about silencing voices; it is about getting back some semblance of a level playing field, recognising the range of genuine voices and turning down the noise from the fakes. The UK is one of 48 countries identified in the Oxford report where cyber-troop manipulation campaigns are being run by private firms on behalf of Government or political actors. The report found that almost $60 million had been spent on hiring these firms since 2009, but I suspect that this figure is only the tip of the iceberg. There needs to be greater transparency and a tightening of the links between the public sector and private contractors.
Cyber-troops sometimes work in conjunction with civil society organisations, internet subcultures, youth groups and fringe movements; groups who may be motivated by genuinely held beliefs but whose causes may ultimately be damaged by those who strategically spread disinformation or computational propaganda. Take, for example, Turning Point, a right-wing youth pressure group. A US Senate report found that its social media activity was regularly co-opted and reposted by the Internet Research Agency, which is known in Russian slang as the “trolls from Olgino”.
The use of third-party campaigning organisations can also be a way to rig the system—to channel illegal levels of funds and campaigns, or at the very least to exploit gaps in our outdated electoral laws in order to press political agendas. Many questions have rightly been asked about the official Vote Leave campaign’s techniques, their links to other groups, the “dark money” spent and their micro-targeting techniques, used in breach of privacy laws.
As the Vote Leave campaign demonstrated, tougher rules are needed in the conduct of future referenda, as well as elections. The Scottish Government introduced the Referendums (Scotland) Act 2020 to better regulate the conduct of any future referendum, where they have the power to do so, including on campaign spending and donations. I would like to see further action to tighten the rules in this place too.
Fighting cyber-troops is complex and has to be tackled on several fronts, with governments, civil society, academia and technology businesses all having a role to play. The social media giants must certainly be better regulated and take greater responsibility for what is published. I therefore welcome the moves to improve regulation through the online safety Bill.
However, the misinformation and disinformation propagated by cyber-troops are clearly an ongoing and growing form of online harm, so it is disappointing that this aspect has not been robustly tackled through these proposals. There are half-hearted plans from the Government for digital imprints, which are a move in the right direction towards greater transparency, but they do not go far enough or fast enough. The get-out clause, which is that the imprint can be located in an
“accessible alternative location linked to the material”,
is not good enough.
Online political advertising remains largely unregulated, and there is nothing from the Government so far that shows a determination to better regulate against indecent and dishonest material, dark ads or data targeting. At the very least, we need to see who is using citizens’ data and why, as well as why people see particular ads. I believe that, on this front, the European regulatory plans go further than those of the UK.
I am aware of the challenges with regulating and fact-checking political content, but it is not impossible to overcome these, and it is essential that this is looked at urgently. It is no longer enough simply to rely on a sense of fair play and “a fair crack of the whip for all sides” to manage the truth amidst the overwhelming barrage of information being dumped upon us. There is no chance for rebuttals from opponents when so much content can spread so widely and maliciously, without any clarity or transparency on the sources.
It is not enough to treat the threat of cyber-troops as solely an electoral phenomenon. The Government’s counter-disinformation unit is usually only operational during periods of heightened vulnerability, when we know that cyber-troops are working to sow division and discord every minute of every day.
Much needs to be done to reform the rules, strengthen democracy and restore faith in our democratic processes, yet there has been disappointingly slow progress so far. Many organisations, such as Reset and the Fair Vote Project, are working on this alongside the all-party parliamentary groups on electoral campaigning transparency and digital regulation and responsibility. They are doing the research and taking forward proposals on a cross-party basis, so a lot of the heavy lifting has already been done on the Government’s behalf.
However, the Government have given no indication that they collect data on cyber-troop activity, despite the important role that they should be playing in analysing and assessing this threat. When I have raised questions about cyber-troops, I have been advised, in response, that the Government’s fine-sounding “defending democracy programme” is tackling this. However, from what I have found so far, it does not seem to be doing very much. Perhaps the Minister can point me to something other than that when she responds today.
We need to stop kicking this into the long grass. There is plenty of evidence of the threats from both within and outwith the UK. I have previously called for a debate, in Government time, on the need for electoral reforms to protect free and fair elections. However, if I cannot have that, we need to have it moving forward on another basis.
This is not a party political issue; it is about integrity in public life. Political differences are healthy, as is debate, but the tactics of division and disinformation from cyber-troops are a cancer on all political discourse, and it is spreading too fast to ignore. We all have a moral imperative to take action, and I call on this Government to do so.
I thank the hon. Member for Midlothian (Owen Thompson) for setting the scene so well. He knows that we do not agree on everything—far from it—but there are many things that we do agree on, and I echo his concerns. This is not about hearing a point of view that we may disagree with; it is about whether something is right; whether it is true.
I was looking through The Times today, and one of the stories refers to fake news and also a fake review, where facts are disputed and questioned. Fake news, as the hon. Gentleman referred to, can suddenly become the perceived truth when quite clearly it is not. I remember many years ago, when I was a young boy, some people at school telling me that if you tell a lie often enough, people will believe it. Whether that is true or not, I suspect that sometimes it is true. People tell a story or a so-called fact over and over, and suddenly somebody will say it is true. That worries me greatly.
The hon. Member for Midlothian referred to voter suppression, and he mentioned the United States of America as an example. What happens in America very often ends up happening here—it is said that when America sneezes, we catch a cold. If that is right, then we need to be really on top of what is happening. The hon. Gentleman referred to three countries, but I will refer to four. Other countries that are very much involved in voter suppression, fake news and telling the truth in a way that suits their political ambitions are Russia, China, Iran and North Korea. In the press a few months ago they were talking about the ability that North Korea now has to do this as well.
I think the Government really need to be on top of this and know what has been put out that is wrong and untruthful, and respond to it in a really positive fashion. I have done the armed forces parliamentary scheme over the years. The last time we did it was with the Royal Air Force and the first couple of times were with the Army. Between doing it way back in 2012 and 2013 and again in 2018, I could see how the role of the Ministry of Defence and the RAF was changing in just those four or five years. I just wanted to highlight that. I very much look forward to the Minister’s response—I say that nicely, but she knows I mean it—which I hope will give us the important reassurance that we seek.
I do not want to say much more but I will refer to a couple more things if I may. Misinformation can be a danger. A comment deliberately taken out of context can and has caused irreparable harm. The good book—the Bible—says that the word is mightier than the sword. It certainly is. It can hurt more. Surgically, the sword can bleed you, but words spoken out of tune, out of place and hurtfully can strike deeper to the heart than anything else. I am always very aware of that as well.
I support the notion of combating this at Governmental level, which is why I look to the Minister for a positive and helpful response. However—I know the hon. Member for Midlothian will understand my point—neither can we be in the position of becoming the guardian of speech. Sir Charles, you are one of those who believes in free speech, and I believe that we must remain free; we must possess the ability to have opposing views, and a way that we can agree to differ and still be friends at the end of it. That is always what I look to do in the comments that I make. We must possess the ability to have opposing views and state them in a non-threatening factual way, with the truth very much in place.
I watched the polarisation that took place in the United States over the last election, and in this nation in reference to Brexit. I am a Brexiteer, and I am glad that we are out of the EU—as a Northern Ireland MP, I know there are obviously issues with the deal, but I am glad that we are out—but how much of that was due to the influences of a variety of forms of social media?
The hon. Member for Midlothian referred to social media, which we all know can be a plus, but it can also be an absolute curse that can destroy people and carry all the wrong things. We all know friends, including colleagues in my party, who have been trolled, as I have been. Some of the comments are absolutely despicable. My staff probably try to protect me from it, which, by the way, is not a bad thing, because an ill-spoken word can be mightier than the sword.
We need to watch our words and ensure that our truth does not eclipse the truth. When I say, “our truth”, I do not mean my truth or the hon. Gentleman’s truth; I mean someone putting out what they refer to as “the truth” when it is not. The balance will be hard to find, but I believe that he, like me, wants to find that balance. That is the thrust of what he said, and I support that. I encourage the Government to use publications such as the “Industrialized Disinformation: 2020 Global Inventory of Organized Social Media Manipulation” report, published by the University of Oxford, along with other evidence to find an informed and balanced way forward.
The hon. Gentleman referred to integrity and said that debate is healthy. So it is. I am always happy to speak to anyone who has a different point of view from me because there is no threat in that, but we should be able to debate in a healthy and constructive way and, at the end of it, still be able to go our different ways, perhaps still with our own points of view.
I finish with a biblical quote as I sometimes like to do in debates, and I hold strongly to this. I was sat here, thinking:
“the truth will set you free”.
I knew that from an early age as a young boy in the children’s meetings in my village and back home in Ballywalter. It is true in political life, it is true in social life and it is true in everything. The truth will set you free. We need to hold to the truth. I very much look forward to the Minister’s response.
It is a pleasure to see you in the Chair this evening, Sir Charles, and to serve under your chairmanship for this debate secured by my hon. Friend the Member for Midlothian (Owen Thompson).
Disinformation, and state-sponsored disinformation campaigns in particular, is an issue close to my heart, and I know it is close to my hon. Friend’s heart. Disinformation represents a growing threat, as was adumbrated in the Oxford Internet Institute’s report he referenced in his opening remarks. It is not just that there are now more so-called cyber-troops working on disinformation campaigns but that they are growing in sophistication, the amount of money being spent on that around the globe has grown into the many millions, and the threat is going only in one direction.
As we know from the events on Capitol Hill in January, disinformation has to radicalise only a relatively small percentage of the population to be a serious and violent threat not just to others in society but to democracy itself. Of course, there are countless examples of that in history throughout the world. We can even look—if anyone cares to—at the example of the bronze soldier of Tallinn in Estonia in 2007.
The hon. Member for Strangford (Jim Shannon) is right to highlight that this is not about regulating people’s opinions and views. He is a staunch Unionist who I have a lot of respect for, and I am a staunch supporter of Scottish independence. It is an idea that has been around since about 843 AD and is a perfectly mainstream view to hold, albeit that I accept it is not held by a majority of those appearing in the debate this evening. However, the Scottish National party recognises how we in particular can be targeted, to be used as a means to sow division, through hostile actors weaponising a mainstream, legitimate idea. We do not want that to happen. We want our debates to be conducted entirely properly.
I want to call for a couple of things. Members of Parliament need a greater understanding of the threat picture. When we talk about the sophisticated network of cyber-troops, exactly what does that mean? I ask the Government to facilitate briefings on the threat picture. I also want us to have a national strategy to counter disinformation. It should build information resilience, and not just among young people in schools—important though that is. The strategy should reach every part of the population. The pandemic has surely shown us why that is important.
Lastly—we could go on much longer, I am sure—I plead with the Government, while accepting that it is not the departmental responsibility of the Minister: the ISC recommendations in the Russia report must be implemented. There is agreement across all the parties in the House on that, and the Government must implement those recommendations, disinformation being one of the many areas where not just the UK but many open societies are vulnerable. There is a good discussion that we could have, that would be free of party political heat, to ensure that we do that, and build a resilient democracy that allows ideas to be debated and to flourish as they should in any free, open society. I think we can all agree on that.
In these one-hour debates the two Opposition Front Benchers get five minutes each, but, since we have a bit of time, if the Labour shadow Minister would like to take a little longer, she can. I hope that all Ministers’ and shadow Ministers’ offices make them aware of the rules on Westminster Hall debates and timings; but please, shadow Minister, have a little longer. It is not as if we are short of time at the moment.
Thank you, Sir Charles. I greatly appreciate that, but in any case I would like to start by saying what a pleasure it is to serve under your chairship, and what appropriate discretion you show.
I congratulate and thank the hon. Member for Midlothian (Owen Thompson), who called today’s debate. The internet plays an ever-increasing role in our lives, as the pandemic has shown. Our work, social and family lives, and our Parliament, are all largely online, and we get our news from online sources and develop our politics digitally. Research by Ofcom shows that Facebook now rivals ITV as the second most popular news source for UK adults. That is reflected at party level. In 2018 the UK’s three largest parties spent £3.7 million on Facebook advertising alone, which, as Members have observed, is largely unregulated.
We live in a digital age and there are, unsurprisingly, perhaps, emerging digital threats to our democracy, so today’s debate is overdue and I am concerned that the Government are weak on online protections, and are presiding over continuing delays to the online safety Bill as well as failing to decisively oversee the impact of Huawei on our telecoms networks.
As we have heard, cyber-troops are Government military or political party teams, committed to manipulating public opinion over social media. I want to congratulate the hon. Member for Midlothian again. I was the first Member of Parliament to mention the internet of things in the House. As I understand it, he is the first Member of Parliament to mention cyber-troops in a debate, and in congratulating him—it is, as we have heard, a very important subject—I wonder why the Government are not bringing forward or raising issues of this importance. It has been 11 years since the earliest reports of organised social media manipulation in 2010, which coincided with the first Conservative Government after Labour.
Members have mentioned the research by the University of Oxford that found that cyber-troop teams use a variety of strategies, tools and techniques, but they often have an overarching communications strategy that involves creating official Government applications, websites or platforms for disseminating content, using accounts that are either real, fake, or automated to interact with users on social media, or creating substantive content such as images, videos or blog posts—fake images, as well. They do not have to be sinister or abusive in themselves: for example, Israel deploys the policy of positive interactions with social media users who are critical of the Government, and the Czech Republic seeks to provide neutral fact-checking services. However, negative strategies are used, as we have heard.
These Government-sponsored accounts are not always run directly by the services whose message they are spreading, nor is that message always obvious. The University of Oxford research found one Russian cyber-trooper who ran a fortune-telling blog that provided insight into relationships, weight loss, feng shui, and just occasionally geopolitics, with the goal of weaving propaganda seamlessly into what appeared to be the non-political musings of an everyday person. The serious point is that this can be very hard for even the most informed consumer to detect.
With the vast range of activities, sources and strategies deployed by cyber-troops, it is important to keep up with the changing geopolitical landscape, so I ask the Minister to tell me what assessment has been made of these measures deployed by cyber-troops across the world, and specifically what the impact is on UK citizens and our democracy. The emergence of these strategies poses a threat to our democracy, which emphasises the importance of collaboration with our friends and allies to map fully the threats that we face. We have heard of the well-publicised Russian disinformation campaign in the US 2016 election, as well as our lack of resilience, and even as our service personnel are mobilised to help contain the pandemic, our adversaries are feeding disinformation and division into our communities. That shows how essential public understanding is in a crisis, and that the enemies of democracy will exploit every weakness. The Government launched the armed forces cyber-regiment last year. That is good, but I ask the Minister why it took so long, and whether we have already been exposed to digital hostility.
The hon. Member for Strangford (Jim Shannon) emphasised the changing role of the armed forces, and I pay tribute to his work with the armed forces. What discussions has the Minister had with the Secretary of State for Defence specifically on cyber-troops? There can be confusion between the responsibilities of the National Cyber Security Centre and those of Ofcom in this key area, so could she outline where the demarcation is there? As we have heard, cyber-troops operate primarily on social media: Facebook, Google, YouTube, Instagram, Reddit, Twitter, and all parts of our social lives, as well as our work lives and consumer lives. However, we have little control over how that content is curated. Facebook itself says that millions of users have been exposed to coronavirus disinformation, so I ask the Minister why we are so slow to introduce any effective regulation of online content, when the online safety Bill will be before us, and whether it will include anything to address cyber-troops.
The hon. Member for Midlothian also highlighted the impact of disinformation on our democracy specifically. What discussions has the Minister had about the need to reform our electoral system to protect it from foreign interference, as the Electoral Commission and the Law Commission have set out in a number of reports? Does she accept that cyber-troops warrant reform to our electoral laws, and what recommendations does she have, if any?
It is harder than ever to trust what we see online, with fake images pretending to be from reputable sources such as the BBC and so on. As we have heard, the Government were themselves guilty of that during the 2019 general election, when they changed their official social media channels to appear as unbiased, neutral fact-checkers. The bots may be following the Prime Minister, but is the Prime Minister following the bots?
So far, the Government appear content to leave those issues to the market, but it is a market that has allowed the spread of disinformation, opening the door to cyber-troops. Self-regulation by the social media giants has failed, and they have made little progress. Twitter has begun checking some of its posts, and Mark Zuckerberg has rolled back on his declaration that Facebook would not become the arbiter of truth, but that is too little, too late. Will the Minister explain why the only requirement that she places on those platforms is that they should not make money from disinformation? Surely there has to be a higher standard than not directly profiting from it.
Finally, anonymity is a complex issue, but the sheer scale of misinformation, online abuse and extremism means that there has to be more we can do. We recognise anonymity as a shield for whistleblowers, victims finding refuge online, or children in minorities exploring self-expression, but how does the Minister see the relationship between anonymity and the work of cyber-troops? Is she looking at the trade-off between protecting privacy and free speech—the hon. Member for Strangford talked about the importance of free speech, and I echo that point—and protecting our democracy and citizens from harm and abuse? Inaction is to make the worst trade-off of them all.
This Government have been in place, in one form or another, since 2010, and in that time we have seen a dramatic change in the prominence and the role that the online world plays in our lives, our democracy, our news and our understanding of the world, yet we have seen no action from the Government. Understanding, mapping and measuring the impact of cyber-troops on UK citizens is an action that any responsible Government should be taking. We have heard today about the ways in which cyber-troops are deployed, controlled and developed, but those strategies will not stay the same—they will continue to evolve—and we are not even playing catch-up, because we do not seem to be in the game at all. I am really pleased that those key points have been raised in the debate, and I hope that the Minister will set out in her response the action that the Government will take.
It is a great pleasure to serve under your stewardship, Sir Charles. I join everyone else in thanking the hon. Member for Midlothian (Owen Thompson) for bringing forward this really important topic. I know that he has long been a really powerful and strong voice on the subject, and he is absolutely right to keep bringing attention to the issue, because the worrying industrialisation in disinformation is something that we should all be concerned about.
A number of Members have spoken about the increasing sophistication of digital technology. Even this week there was a deepfake of Tom Cruise on TikTok; it was incredibly lifelike and plausible and was not intended for sinister purposes. That only underlines the art of the possible if that technology is in the hands of those who are up to no good. It has always been vital that UK citizens have access to accurate information when it comes to elections but also situations such as our current pandemic; it is vital to our democracy and everyday life as well. Disinformation and misinformation, which is spread without intention, threaten our democratic freedoms and can cause harm to individuals and society, and it is an issue that the Government take incredibly seriously.
That is why we established a dedicated counter-disinformation unit, which brings together cross-Government monitoring and analysis capabilities to build a comprehensive picture of disinformation and misinformation. It works with partners to ensure appropriate action is taken. The hon. Gentleman rightly said that this unit has generally been stood up at elections; it was stood up during the European parliamentary elections, the UK general election in 2019 and again in March last year to respond to the covid pandemic, and it remains operational. The component parts of the unit remain operational all the time—organisations such as the 77th Brigade, for example.
Throughout the pandemic particularly, the unit has been working closely with social media platforms to quickly identify potentially harmful content on their platforms and help them respond to it. We have seen major platforms update their terms of service and introduce new measures to tackle disinformation and misinformation related to covid-19. This is not just about not being able to profit from it; a really important part of the agreement is that they also put up links to reliable, Government-backed sources of information. We welcome this, and there is clearly more to do. We continue to put pressure on platforms to ensure that their policies and enforcement are fit for purpose, while respecting freedom of expression. The unit also works with Government communications teams to ensure that public communications and community engagement address false information where appropriate to do so.
I thank the Minister for her comments and the information she is giving. Before she moves on from the unit for disinformation, when I asked recently how many full-time employees it had, the answer was none. She has talked about how spread out it is, but given the increased importance of disinformation, will there be full-time employees in the unit or will they all have other things to do as well as disinformation?
The hon. Gentleman is absolutely right. We are working closely with the police and also the Army, as I have mentioned. I am always slightly nervous about what I am allowed to say around this issue, not being an MOD Minister, but there is the 77th Brigade, which is a military unit dedicated to this sort of activity and with which we work very closely.
While such information can come from a range of sources, we know that certain states routinely use it as a tool to exploit our open system by sowing division and undermining trust in our democracy, as the hon. Gentleman said. This can be through disinformation, cyber-attacks and other methods. We have made it clear that any foreign interference in the UK’s democratic process is absolutely unacceptable—it does not even need to be said—and it is, and always will be, an absolute priority to protect the UK against it. The UK, along with our G7 and NATO partners, is working hard to protect our democracy against disinformation as we work together to tackle the shared threat of covid-19.
We remain firmly committed to protecting our democratic values and our electoral processes, which I know the hon. Member for Midlothian is concerned about, and we have robust systems in place to protect the UK against foreign interference. As he says, it is all about working collaboratively. These systems bring together Government, civil society and private sector organisations to monitor and respond to interference in whatever form it takes. The hon. Member for Newcastle upon Tyne Central (Chi Onwurah) talked about these things sometimes coming in the guise of something that could look quite harmless but can actually be incredibly sinister.
It is absolutely vital to ensure that our democracy stays open, vibrant and transparent. The Government are strengthening our legislative framework, enhancing capabilities and engaging with partners to expand our efforts to ensure the maximum impact. That joined-up approach is supported through the defending democracy programme, based in the Cabinet Office, which provides a strategic co-ordinating forum, drawing together work and expertise across Departments on a number of fronts to protect democratic processes, strengthen the integrity of elections, encourage respect for open and safe democratic participation, and promote open, fact-based discourse.
The Government are taking steps to strengthen elections by introducing legislation, as the hon. Member for Midlothian said, to ensure that the framework is fit for the modern age, for example by updating online campaigning rules. In May 2019, the Government committed to introducing a digital imprints regime, which will inform voters about the source of online campaign material. In August, we launched a technical consultation on this proposal. It closed in November, and further details will be set out shortly.
During major democratic events, the Government stand up an election cell—a co-ordinated structure that works with relevant organisations to identify and respond to emerging issues and protect the safety and security of the democratic process. The counter-disinformation unit works closely with the election cell, co-ordinating the Government’s operational response to any evolving threat of disinformation and other forms of online manipulation. The Government are working really closely with partners to support the delivery of safe and inclusive elections. Of course, the next ones will be held very shortly, in May.
The Government welcome the valuable analysis and insight from academia, including the Oxford University report, and we take seriously the findings of other experts in this field. Countering disinformation and other forms of manipulation requires a whole-of-society approach, and the Government are working closely with the Oxford Internet Institute and other stakeholders from civil society, academia and industry to much better understand the issues in this space. In particular, last year the Government launched a counter-disinformation policy forum, bringing together key actors in industry, civil society and academia to improve responses to misinformation and disinformation and, crucially, to prepare for future threats. This forum contributes to the collective understanding of challenges to the information ecosystem, allows us to improve the responses that our organisations can deliver to better mitigate evolving threats posed by false narratives, and helps us to prepare for future advances in technology, which is of course what we are all really worried about; as we have already said, the technology evolves rapidly.
We are entering a new age of accountability for the tech industry. The hon. Member for Midlothian and others mentioned the online safety legislation. We announced plans at the end of last year for a groundbreaking rulebook that will make tech companies responsible for tackling harmful content on their sites. This new regulatory framework will give digital businesses much more robust rules of the road, as it were, so that we can seize the brilliance of modern technology to improve our lives while protecting children, building trust and, crucially, tackling criminal activity online.
The full Government response to the online harms White Paper was published at the end of last year and set out how the proposed legal duty of care on online companies will work in practice. It will of course defend freedom of expression and the role of the free press. The new laws will also ensure appropriate checks and balances on platforms’ power over public discourse and will promote a thriving democracy where pluralism and freedom of expression are protected. The laws will have robust and proportionate measures to deal with misinformation and disinformation. That is crucial, because we know that they can cause significant physical or psychological harm to an individual. An example is the anti-vax falsehoods that we are seeing around covid-19 at the moment. Crucially, the Bill will give Ofcom the tools it needs to understand how effectively disinformation is being addressed. That will be done through transparency reports, and then it can take action in the appropriate way, as required.
As the hon. Lady knows, the NCSC is not a regulator, but it provides authoritative advice, and the online harms response says very clearly that it is vital that Ofcom is able to take advice, if necessary, from experts in whatever field, whether civil society, charities, academia or businesses. They will have to work together very collaboratively, because it is Ofcom’s job to hold companies to account to ensure that this issue is being tackled appropriately.
It is important to say that we really do support freedom of expression as a fundamental right. It is an essential element of the full range of human rights. Therefore, while we take action to address false narratives online, we have to remain committed to protecting the freedom of expression that we are so well known for, across our nations. However, our commitment to tackling misinformation and disinformation in all their forms remains an absolutely key priority. Our challenge as a society is to help to shape the internet so that it remains open and vibrant but still protects users from all kinds of harm. It is a really difficult balance to strike, but our commitment to protecting our democratic freedoms and processes from outside interference by any actor, whether state or non-state, remains unwavering.
Thank you, Sir Charles. I will briefly thank all hon. Members for their contributions this afternoon. I think we have seen a very clear understanding that it is in all our interests to ensure that we tackle this issue and get it right. I very much endorse the comments of my hon. Friend the Member for Glasgow South (Stewart Malcolm McDonald) about seeking a strategy, because we are starting to see a swell of opinion for tackling some of these things, especially misinformation online. We have seen the importance of that through the current pandemic. The public need to be able to have confidence in the information that they access.
In a nutshell, the issue comes back to what the hon. Member for Strangford (Jim Shannon) said. He very ably made the point that it is so important that we are able to agree to disagree. I do not think that anybody is suggesting that we need to have any kind of thought control or that everybody has to have the same opinions. It is important that we do not, but it is important also that we can have confidence that those views and opinions are presented in a way that is accurate and factual.