
Surveillance Camera Code of Practice

Volume 818: debated on Wednesday 2 February 2022

Motion to Regret

Moved by

That this House regrets the Surveillance Camera Code of Practice because (1) it does not constitute a legitimate legal or ethical framework for the police’s use of facial recognition technology, and (2) it is incompatible with human rights requirements surrounding such technology.

Relevant document: 23rd Report from the Secondary Legislation Scrutiny Committee

My Lords, I have raised the subject of live facial recognition many times in this House and elsewhere, most recently last November, in connection with its deployment in schools. Following an incredibly brief consultation exercise, timed to coincide with the height of the summer holidays, the Government laid an updated Surveillance Camera Code of Practice, pursuant to the Protection of Freedoms Act 2012, before both Houses on 16 November last year; it came into effect on 12 January 2022.

The subject matter of this code is of great importance. The last Surveillance Camera Commissioner did a survey shortly before stepping down, and found that there are over 6,000 systems and 80,000 cameras in operation across 183 local authorities. The UK is now the most camera-surveilled country in the western world. According to recently published statistics, London remains the third most surveilled city in the world, with 73 surveillance cameras for every 1,000 people. We are also faced with a rising tide of the use of live facial recognition for surveillance purposes.

Let me briefly give a snapshot of the key arguments why this code is insufficient as a legitimate legal or ethical framework for the police’s use of facial recognition technology and is incompatible with human rights requirements surrounding such technology. The Home Office has explained that changes were made mainly to reflect developments since the code was first published, including changes introduced by legislation such as the Data Protection Act 2018 and those necessitated by the successful appeal of Councillor Ed Bridges in the Court of Appeal judgment on police use of live facial recognition issued in August 2020, which ruled that South Wales Police’s use of AFR—automated facial recognition—had not in fact been in accordance with the law on several grounds, including in relation to certain convention rights, data protection legislation and the public sector equality duty.

During the fifth day in Committee on the Police, Crime, Sentencing and Courts Bill last November, the noble Baroness, Lady Williams of Trafford, the Minister, described those who know about the Bridges case as “geeks”. I am afraid that does not minimise its importance to those who want to see proper regulation of live facial recognition. In particular, the Court of Appeal held in Bridges that South Wales Police’s use of facial recognition constituted an unlawful breach of Article 8—the right to privacy—as it was not in accordance with law. Crucially, the Court of Appeal held that certain bare minimum safeguards were required before the question of lawfulness could even be considered.

The previous surveillance code of practice failed to provide such a basis. This, the updated version, still fails to meet the necessary standards, as the code allows wide discretion to individual police forces to develop their own policies in respect of facial recognition deployments, including the categories of people included on a watch-list and the criteria used to determine when to deploy. There are but four passing references to facial recognition in the code itself. This scant guidance cannot be considered a suitable regulatory framework for the use of facial recognition.

There is, in fact, no reference to facial recognition in the Protection of Freedoms Act 2012 itself or indeed in any other UK statute. There has been no proper democratic scrutiny over the code and there remains no explicit basis for the use of live facial recognition by police forces in the UK. The forthcoming College of Policing guidance will not satisfy that test either.

There are numerous other threats to human rights that the use of facial recognition technology poses. To the extent that it involves indiscriminately scanning, mapping and checking the identity of every person within the camera’s range—using their deeply sensitive biometric data—LFR is an enormous interference with the right to privacy under Article 8 of the ECHR. A “false match” occurs where someone is stopped following a facial recognition match but is not, in fact, the person included on the watch-list. In the event of a false match, a person attempting to go about their everyday life is subject to an invasive stop and may be required to show identification, account for themselves and even be searched under other police powers. These privacy concerns cannot be addressed by simply requiring the police to delete images captured of passers-by or by improving the accuracy of the technology.

The ECHR requires that any interference with the Article 10 right to freedom of expression or the Article 11 right to free association is in accordance with law and both necessary and proportionate. The use of facial recognition technology can be highly intimidating. If we know our faces are being scanned by police and that we are being monitored when using public spaces, we are more likely to change our behaviour and be influenced on where we go and who we choose to associate with.

Article 14 of the ECHR ensures that no one is denied their rights because of their gender, age, race, religion or beliefs, sexual orientation, disability or any other characteristic. Police use of facial recognition gives rise to two distinct discrimination issues: bias inherent in the technology itself and the use of the technology in a discriminatory way.

Liberty has raised concerns regarding the racial and socioeconomic dimensions of police trial deployments thus far—for example, at Notting Hill Carnival for two years running as well as twice in the London Borough of Newham. The disproportionate use of this technology in communities against which it “underperforms”—according to its proponents’ standards—is deeply concerning.

As regards inherent bias, a range of studies have shown facial recognition technology disproportionately misidentifies women and BAME people, meaning that people from these groups are more likely to be wrongly stopped and questioned by police and to have their images retained as the result of a false match.

The Court of Appeal determined that South Wales Police had failed to meet its public sector equality duty, which requires public bodies and others carrying out public functions to have due regard to the need to eliminate discrimination. The revised code not only fails to provide any practical guidance on the public sector equality duty but, given the inherent bias within facial recognition technology, also fails to emphasise the rigorous analysis and testing that the duty requires.

The code itself covers nobody other than the police and local authorities; it does not extend, in particular, to Transport for London, central government or private users, where there have also been concerning developments in the use of police data. For example, it was revealed that the Trafford Centre in Manchester scanned the faces of every visitor—approximately 15 million people—over a six-month period in 2018, using watch-lists provided by Greater Manchester Police. LFR was also used at the privately owned but publicly accessible site around King’s Cross station. Both the Met and British Transport Police had provided images for their use, despite originally denying doing so.

It is clear from the current and potential future human rights impact of facial recognition that this technology has no place on our streets. In a recent opinion, the former Information Commissioner took the view that South Wales Police had not ensured that a fair balance had been struck between the strict necessity of the processing of sensitive data and the rights of individuals.

The breadth of public concern around this issue is growing clearer by the day. Several major cities in the US have banned the use of facial recognition and the European Parliament has called for a ban on police use of facial recognition technology in public places and predictive policing. In response to the Black Lives Matter uprisings in 2020, Microsoft, IBM and Amazon announced that they would cease selling facial recognition technology to US law enforcement bodies. Facebook, aka Meta, also recently announced that it will be shutting down its facial recognition system and deleting the “face prints” of more than a billion people after concerns were raised about the technology.

In summary, it is clear that the Surveillance Camera Code of Practice is an entirely unsuitable framework to address the serious rights risk posed by the use of live facial recognition in public spaces in the UK. As I said in November in the debate on facial recognition technology in schools, the expansion of such tools is a

“short cut to a widespread surveillance state.”—[Official Report, 4/11/21; col. 1404.]

Public trust is crucial. As the Biometrics and Surveillance Camera Commissioner said in a recent blog:

“What we talk about in the end, is how people will need to be able to have trust and confidence in the whole ecosystem of biometrics and surveillance”.

I have on previous occasions, not least through a Private Member’s Bill, called for a moratorium on the use of LFR. In July 2019, the House of Commons Science and Technology Committee published a report entitled The Work of the Biometrics Commissioner and the Forensic Science Regulator. It repeated a call made in an earlier 2018 report that

“automatic facial recognition should not be deployed until concerns over the technology’s effectiveness and potential bias have been fully resolved.”

The much-respected Ada Lovelace Institute has also called for

“a voluntary moratorium by all those selling and using facial recognition technology”,

which would

“enable a more informed conversation with the public about limitations and appropriate safeguards.”

Rather than update toothless codes of practice to legitimise the use of new technologies like live facial recognition, the UK should have a root and branch surveillance camera review which seeks to increase accountability and protect fundamental rights. The review should investigate the novel rights impacts of these technologies, the scale of surveillance we live under and the regulations and interventions needed to uphold our rights.

We were reminded by the leader of the Opposition on Monday of what Margaret Thatcher said, and I also said this to the Minister earlier this week:

“The first duty of Government is to uphold the law. If it tries to bob and weave and duck around that duty when it’s inconvenient, if Government does that, then so will the governed and then nothing is safe—not home, not liberty, not life itself.”

It is as apposite for this debate as it was for that debate on the immigration data exemption. Is not the Home Office bobbing and weaving and ducking precisely as described by the late Lady Thatcher?

My Lords, the noble Lord, Lord Clement-Jones, has given an eloquent exposition of the reasons for supporting his Motion of Regret. The Motion refers to the ethical and human rights considerations that attach to the use of surveillance camera technology, and it is to those two considerations that I shall address my remarks. I especially draw the Minister’s attention to the Amnesty International report of 3 June 2021 about the use of surveillance technology in New York, to which the noble Lord referred, and also to the serious civil liberty questions that that report raised. Concerns were raised in Japan on 28 December in Yomiuri Shimbun; in the Financial Times on 10 June, about Chinese technology in Belgrade; in the Asia News Monitor in November 2021, in a report from Thailand about mass surveillance against Uighurs in Xinjiang; and in a report in the Telegraph of 1 December, in which the head of MI6, Richard Moore, said that

“technologies of control … are increasingly being exported to other governments by China—expanding the web of authoritarian control around the planet”.

It is not just control—it is also a keystone in the export of truly shocking crimes against humanity and even genocide. Just a week ago, we marked Holocaust Memorial Day, on which many colleagues from across the House signed the Holocaust Memorial Day book or issued statements recommitting to never allowing such a genocide to happen ever again. Yet, sadly, in 2022, as the Foreign Secretary has said, a genocide against the Uighur Muslims is taking place in Xinjiang. As I argued in our debate on Monday, we are doing far too little to sanction those companies that are actively involved, or to regulate and restrict the facial recognition software that has allowed the Chinese state to incarcerate and enslave more than a million Uighurs.

In the 1940s, we did not allow IBM’s machines, or other tools of genocide used in Nazi Germany and manufactured by slave labour in factories and concentration camps, to be sold in the United Kingdom. Today we find ourselves in the perverse situation of having Chinese surveillance cameras with facial recognition software being used in government departments, hospitals, schools and local councils as well as in shops, such as Tesco and Starbucks. It is an issue that I doggedly raised during our debates on the telecommunications Bills that have recently been before your Lordships’ House. As I said in those debates, a series of freedom of information requests in February 2021 found that more than 70% of local councils use surveillance cameras and software from either Dahua Technology or Hikvision, which are companies rightly subject to United States sanctions for their involvement in the development and installation of technology and software that targets Uighur Muslims. Nevertheless, these companies are free to operate in the United Kingdom.

So much for co-ordinating our response with our Five Eyes allies, which was the subject of one amendment that I laid before your Lordships’ House. Far from being a reputable or independent private company, Hikvision is more than 42% owned by Chinese state-controlled enterprises. According to Hikvision’s accounts for the first half of 2021, the company received RMB 223 million in state subsidies, while it works hand in glove with the authorities in Xinjiang, having signed five public-private partnerships with them since 2017. Perhaps just as disturbing are the recent reports in the Mail on Sunday that Hikvision received up to £10,000 per month of furlough money from United Kingdom taxpayers from December 2020 until February 2021. How can it be right that, at a time when the US Government are sanctioning Hikvision for its links to Uighur concentration camps, the UK Government are giving it taxpayers’ money and Covid furlough funds?

It is clear that the introduction and use of this type of facial recognition software technology by the police needs substantial regulation and oversight, especially because of the dominance of sanctioned Chinese companies in the UK surveillance market. Hikvision alone has nearly 20% of the global surveillance camera market. Hikvision is working hard to penetrate and dominate the UK surveillance technology sector. In May 2021, it launched a consultant support programme and demonstration vehicles so it could bring its technology

“to all parts of the United Kingdom”.

In October, it became a corporate partner in the Security Institute, the UK’s largest membership body for security professionals, and it has launched a dedicated UK technology partner programme. All of this deserves further investigation by our domestic intelligence services.

I agree with the noble Lord, Lord Clement-Jones, a long-time friend, that the surveillance camera code of practice is insufficient in addressing legitimate human rights concerns around the widespread use of such technology. Nor do I think this technology is necessary for the police to maintain public safety or to tackle criminal enterprise. Despite some of the recent instances of poor policing in this country, on the whole we have a better culture of policing—a point that the Minister often makes, and I agree with her—that recognises the balance between protecting public order and serving the community, certainly in comparison with many other developed countries. We should therefore be cautious about importing both the technology and the tactics of authoritarian regimes. After all, these tactics and technology come from countries that do not have the rule of law and seek to maintain the power of their regimes through a mixture of brutality, fear and the regular incarceration of their people, which is made all the easier by the unregulated use of facial recognition software and surveillance.

More broadly, the Government need to look seriously at banning the participation of Hikvision, Dahua Technology and other sanctioned companies from the UK market. We should emulate the USA and Australia, which have recognised not only the human rights concerns but the national security concerns regarding these cameras. Those countries are actively removing Hikvision cameras from public buildings.

The UK should also introduce its own entities list, which would include sanctions and investment bans against Chinese companies actively involved in the construction and maintenance of the concentration camps in Xinjiang. That would include the likes of Hikvision, Dahua Technology, SenseTime and the audio recording company iFlytek. It is particularly unacceptable that Legal and General, the largest pension fund manager in the UK, continues to have holdings in iFlytek.

The Minister should explain to the House why Hikvision was able to access the UK furlough scheme, what steps the Government will take to recoup taxpayers’ money that has gone to Chinese companies sanctioned for their involvement in genocide—an issue raised by the noble Lord, Lord Agnew, during his recent resignation statement—and why Hikvision has not been banned here, as it has in the US.

It is clear that there needs to be legislation to regulate the use of facial recognition software and surveillance technology in the United Kingdom. I strongly agree with the recommendation made earlier by the noble Lord, Lord Clement-Jones, referring to his Private Member’s Bill. I urge the Minister to work with colleagues to bring forward legislation in this area at the earliest opportunity. She should rest assured that if the Government do not, I am sure that noble Lords such as the noble Lord, Lord Clement-Jones, will continue to press for legislation to that effect.

As a society, we must work harder to repudiate those few misguided individuals who seek to import and expand the use of Chinese facial recognition technology, software and tactics in the UK. Those who think that China’s social credit system or mass surveillance system are benign, or at the very least economically beneficial, need to have their heads examined. They may couch these policies in the language of technological progress but, as history has shown, such intrusive mass-surveillance systems have always been the handmaiden of fascism.

I hope that the Minister will carefully respond to what I have said today and that all of us can work to regulate the use of this technology and to make the presence of Hikvision and Dahua Technology in the UK history.

My Lords, as expectations of privacy are lower in public places than at home, overt surveillance, such as by street cameras, is generally seen as a lesser intrusion into our liberties than either covert surveillance by intelligence agencies—the subject of my 2015 report, A Question of Trust—or so-called surveillance capitalism, the monitoring and monetising of our personal data by big tech. However, that assessment has been cast into doubt by automatic facial recognition and similar technologies, which potentially enable their users to put a name to every person picked up by a camera, to track their movements and to store images of them on vast databases that can be efficiently searched using AI-driven analytics.

Those databases are not all owned by the police: the company Clearview AI has taken more than 10 billion facial images from publicly accessible web sources and boasts on its website that its database is available to US law enforcement on a commercial basis. This technology, part of the information revolution in whose early stages we now find ourselves, can no more be stopped than could the steam engine two centuries ago, but, as has been said, the abuses of overt surveillance are already obvious in the streets of China and Hong Kong. To show the world that we are better, we must construct for those who wish to use these powers, as our forebears did in the Industrial Revolution, a democratic licence to operate.

We start in this country with a number of advantages. We have a strong tradition of citizen engagement and, as the noble Lord, Lord Alton, said, a culture of policing by consent. We inherited strong data protection laws from the EU and we still have legislation that gives real protection to human rights. We even had—almost uniquely in the world—a Surveillance Camera Commissioner, Tony Porter. I pay tribute to the extraordinary work that he did, on a part-time basis and without any powers of inspection, audit or sanction, including the issue of a 70-page document with detailed recommendations for police users of this technology.

I regret that the Surveillance Camera Code of Practice is, by comparison, a slim and highly general document. It is not comparable to the detailed codes of practice issued under the Investigatory Powers Act 2016 and overseen by the world-leading Investigatory Powers Commissioner’s Office. The designated bodies which must have regard to it are confined to local authorities and policing bodies; they do not include, as the noble Lord, Lord Clement-Jones, said, health, education or transport providers, private operators or, indeed, the Government themselves. Consultation on the latest version made no attempt to involve the public but was limited to statutory consultees.

The recent annual report of Tony Porter’s impressively qualified but thinly spread successor, the Biometrics and Surveillance Camera Commissioner, Fraser Sampson, commented that his formal suggestions for the code were largely dismissed as being “out of scope”. He added:

“That my best endeavours to get even a sentence reminding relevant authorities of the ethical considerations were rejected on the grounds that it would be too burdensome is perhaps an indication of just how restrictive this scope—wherever it is to be found—must have been.”

I do not know whether the highly general provisions of the code will be effective to improve local policies on the ground and ensure the consistency between them that my noble and learned friend Lord Etherton and his colleagues gently pointed out was desirable in their judgment in the Bridges case. In the absence of an IPCO-style inspection regime, perhaps we never will know. I suspect that the need not to stifle innovation, advanced in the code as a justification for its brevity, is a less than adequate excuse for the failure to do more to develop the code itself against a changing legal and technological background.

The words of the Motion are harsher than I would have chosen but, as the Snowden episode a few years ago showed, public trust in these increasingly intrusive technologies can be suddenly lost and requires huge effort to regain. I hope that the next revision of this code will be more energetic and ambitious than the last.

My Lords, it is a pleasure to follow three incredibly distinguished speakers in this debate. With reference to the remark that the noble Lord, Lord Clement-Jones, attributed to the Minister, I must say that if this is a subject for geeks, I am delighted to join the band of geeks.

I fear I shall demonstrate a level of ignorance tonight, because I am a newcomer to the debate. In fact, I emailed the noble Lord, Lord Clement-Jones, earlier today because I had only just realised that it was taking place tonight. I am also speaking in a hybrid capacity—I now understand the true meaning of “hybrid”—so my opening remarks will be personal, but for those that follow, I will need to declare an interest, so I shall do so in advance of making those remarks.

In my opening remarks I have to say just a few things that demonstrate what a parlous state we are in as a country in terms of respect for human rights. The level of permissiveness in the capture—state capture, policy capture—of institutions that operate in authoritarian regimes, a list of which the noble Lord, Lord Alton, has given us, is truly staggering. We bang on about how fantastic our sanctions regime is, and so on, yet these companies, many of them Chinese, as the noble Lord described, operate here with complete impunity and we seem entirely content to allow them to do so, while we also recognise, in our foreign policy statements, that some of these countries have very ignoble intentions towards any freedom-loving democracy. I know the noble Baroness represents the Home Office, but I hope it is something the Government at large will take account of, because commercial surveillance, commercial espionage, commercial authority and commercial capture of the economy are all things we need to be incredibly vigilant about. One needs only to look at Russia’s capture of the German political debate, through Nord Stream 2, and what we are facing now with the Ukraine issue, to understand what is being discussed here by the noble Lord, Lord Alton.

Those are my general remarks. My remarks as chair of the Equality and Human Rights Commission now follow. There, I have to say to the noble Lord, Lord Clement-Jones, that I am so relieved he managed to secure this regret Motion. Articles 8, 9, 10, 11 and 14—the general article against discrimination—of the European Convention on Human Rights are engaged in this, so the fact that we get a document as thin as this is truly remarkable. I understand why only statutory bodies were consulted—it was a means for the Government to get it through in six weeks without engaging with broader concerns—but it is regrettable. The Bridges case directly engaged the public sector equality duty. The Equality and Human Rights Commission is the regulator of the public sector equality duty, yet the idea that it was not consulted, post the judgment, on how we might strengthen the code in light of that judgment is a matter of deep regret to me.

I have a couple of points on the code. In paragraph 10.4 we are told that there should be effective review and audit mechanisms, the results of which should be published regularly. The summary of such a review has to be made available publicly, so my question to the noble Baroness is: why only a summary? In the interests of transparency and accountability, it is essential that these bodies regularly give a full explanation of what they are doing. The public sector equality duty requires legitimate aims to be addressed objectively, verifiably and proportionately. We, the public, will not be capable of assessing whether those tests have been met if there is only an executive summary to go by.

My other point concerns section 12.3, “When using a surveillance camera” and so on. The third bullet point requires “having due regard” and states that

“chief police officers should … have regard to the Public Sector Equality Duty, in particular taking account of any potential adverse impact that the LFR algorithm may have on members of protected groups.”

Again, no practical examples are provided in this rather thin document. We know from publishing statutory codes that the public, and even the bodies that use this technology, want practical examples. A code is effective, of value and of use, to the providers as well as the public, only when it gives those practical examples, because you cannot test the legal interpretation of those examples until you have that evidence before you.

We, the EHRC, have been unable at short notice to assess whether the code is in compliance with the Bridges judgment—I wonder, myself, whether it is—but we do not take a clear position on the legality of the revised code, and I should say that by way of clarification. However, we have recommended previously that the Government scrutinise the impact of any policing technologies, in particular their impact on ethnic minorities, because we have a mountain of evidence piling up to say that they discriminate against people of darker skin colour.

We wanted mandatory independent equality and human rights impact assessments. These should ensure that decisions regarding the use of such technologies are informed by those impact assessments and the publication of the relevant data—this takes me back to my point about executive summaries—and then evaluated on an ongoing basis, and that appropriate mitigating action is taken through robust oversight, including the development of a human rights compliant legal, regulatory and policy framework. That is in conformity with our role as a regulator. We have recommended that, in light of evidence regarding their inaccuracy and potentially discriminatory impacts, the Government review the use of automated facial recognition and predictive programs in policing, pending completion of the above independent impact assessments and consultation processes, and the adoption of appropriate mitigation action. We await action from the Government on the basis of this recommendation.

In concluding, I want to share the new workstream that we will have in our new strategic plan, which we hope will be laid before Parliament in the next couple of months. In that plan, we will be influencing UK regulatory frameworks to ensure that equality and human rights are embedded in the development and application of artificial intelligence and digital technology. That is a change from the past and an innovation that we intend to take extremely seriously. We will take enforcement and other legal action, so that the use of AI in recruitment, in policing and in other employment practices does not bias decision-making or breach human rights. I look forward to the Minister’s response.

My Lords, I first congratulate the noble Lord, Lord Clement-Jones, on securing this debate. Obviously, all who have spoken deserve a response to the points they have raised, but I am particularly interested in what the reply will be to the noble Baroness, Lady Falkner of Margravine, who asked who was and who was not consulted and why. The point she made there most certainly deserves a response from the Government.

The Surveillance Camera Code of Practice was first published in June 2013 under provisions in the Protection of Freedoms Act 2012. It provides guidance on the appropriate use of surveillance camera systems by local authorities and the police. Under the 2012 Act these bodies

“must have regard to the code when exercising any functions to which the code relates”.

As has been said, the Government laid an updated code before both Houses on 16 November last year and, as I understand it, the code came into effect on 12 January this year. The Explanatory Memorandum indicates that changes were made mainly to reflect developments since the code was first published, including changes introduced by legislation such as the Data Protection Act 2018 and those arising from a Court of Appeal judgment on police use of live facial recognition issued in August 2020, which was the Bridges v South Wales Police case.

Reporting the month before last, our Secondary Legislation Scrutiny Committee commented that the revised code reflects the Court of Appeal judgment

“by restricting the use of live facial recognition to places where the police have reasonable grounds to expect someone on a watchlist to be”

and added that the technology

“cannot be used for ‘fishing expeditions’”.

The committee continued:

“The Code now requires that if there are no suggested facial matches with the watchlist, the biometric data of members of the public filmed incidentally in the process should be deleted immediately. Because the technology is new, the revised Code also emphasises the need to monitor its compliance with the public sector equality duty to ensure that the software does not contain unacceptable bias. We note that a variety of regulators are mentioned in the Code and urge the authorities always to make clear to whom a person who objects to the surveillance can complain.”

As the regret Motion suggests, there is disagreement on the extent to which the code forms part of a sufficient legal and ethical framework to regulate police use of facial recognition technology, whether it is compatible with human rights—including the right to respect for private life—and whether it can discriminate against people with certain protected characteristics. Interpretations of the Court of Appeal judgment’s implications for the continued use of facial recognition technology differ too.

As has been said, the use of facial recognition is a growing part of our everyday lives—within our personal lives, by the private sector and now by the state. It can be a significant tool in tackling crime but comes with clear risks, which is why equally clear safeguards are needed. It appears that our safeguards and understanding of and frameworks for this spreading and developing technology are largely being built in a piecemeal way in response to court cases, legislation and different initiatives over its use, rather than strategic planning from the Government. Parliament—in particular MPs but also Members of this House—has been calling for an updated framework for facial technology for some years, but it appears that what will now apply has finally come about because of the ruling on the Bridges v South Wales Police case, rather than from a government initiative.

The police have history on the use of data, with a High Court ruling in 2012 saying that the police were unlawfully processing facial images of innocent people. I hope the Government can give an assurance in reply that all those photos and data have now been removed.

While a regularly updated framework of principles is required, as legislation alone will struggle to keep up with technology, can the Government in their response nevertheless give details of what legislation currently governs the use and trials of facial recognition technology, and the extent to which the legislation was passed before the technology really existed?

On the updates made to the code, it is imperative that the technology is used proportionately and as a necessity. What will be accepted as “reasonable grounds” for the police to expect a person to be at an event or location, in order to prevent “fishing expeditions”? As the Explanatory Memorandum states:

“The Court of Appeal found that there is a legal framework for its use, but that South Wales Police did not provide enough detail on the categories of people who could be on the watchlist, or the criteria for determining when to use it, and did not do enough to satisfy its public sector equality duty.”

Can the Government give some detail on how these issues have now been addressed?

A further area of concern is the apparent bias that can impact this technology, including that its use fails to properly recognise people from black and minority-ethnic backgrounds and women. That is surely a significant flaw in technology that is meant to recognise members of our population. We are told that the guidance now covers:

“The need to comply with the public sector equality duty on an ongoing basis through equality impact assessments, doing as much as they can to ensure the software does not contain unacceptable bias, and ensuring that there is rigorous oversight of the algorithm’s statistical accuracy and demographic performance.”

What does that look like in practice? What is being done to take account of these issues in the design of the software and in the datasets used for training for its use? What does ongoing monitoring of its use and outcomes look like? The Secondary Legislation Scrutiny Committee raised the question of who a person should direct a complaint to if they object to the use of the technology, and how that will be communicated.

We have previously called for a detailed review of the use of this technology, including the process that police forces should follow to put facial recognition tools in place; the operational use of the technology at force level, taking into account specific considerations around how data is retained and stored, regulated, monitored and overseen in practice, how it is deleted and its effectiveness in achieving operational objectives; the proportionality of the technology’s use to the problems it seeks to solve; the level and rank required for sign-off; the engagement with the public and an explanation of the technology’s use; and the use of technology by authorities and operators other than the police.

What plans do the Government have to look at this issue in the round, as the code provides only general principles and little operational information? The Government previously said that the College of Policing has completed consultation on national guidance which it is intended to publish early this year, and that the national guidance is “to address the gaps”. Presumably these are the gaps in forces’ current published policies. What issues will the national guidance cover, and will it cover the issues, with great clarity and in detail, which we think a detailed review of the use of this technology should include and which I have just set out? Unfortunately, the Explanatory Memorandum suggests that neither the College of Policing national guidelines nor the updated code will do so or indeed are intended to do so.

My Lords, I thank the noble Lord, Lord Clement-Jones, for securing this debate and all who spoke in it. Let me clarify that when I referred to those who are interested and knowledgeable about LFR as “geeks”, it was meant as a compliment. Sometimes it is difficult to get people to be interested in some of the things that we do in the Home Office. I am also grateful to the noble Lord for putting on record his views on the revised code, which came into force on 12 January of this year. I understand that it was published in full, and there is more detail in accompanying documents, including the College of Policing guidance and ICO guidance.

As I think the noble Lord, Lord Clement-Jones, said, the code was established in 2013 during the coalition Government under PoFA—the Protection of Freedoms Act 2012—to provide guidance to local authorities and the police on the appropriate use of surveillance camera systems.

Surveillance in schools is not really a matter for the surveillance camera code of practice. Private use, which the noble Lord also talked about, is of course a DCMS matter. I am not trying to pass the buck, but it is not unusual for people to get those mixed up. In fact, that goes to the heart of what the Government are trying to do—namely, to simplify the landscape, which is all too often far too complex.

The principles in the code enable the police and local authorities to operate surveillance cameras in a way that complies with the breadth of relevant law in this area. Because the code is principles-based rather than technology-specific, it has remained largely up to date despite the pace of technological advancement in this area. Therefore, the changes do not increase the scope of the code or, indeed, its intended impact.

There have been a number of legislative developments and a key court ruling since the code was first published, which noble Lords referred to. The reason for updating the code was to reflect those changes, while we also took the opportunity to make the text easier for users to follow at the same time.

The consultees were mainly policing bodies and commissioners, including the Information Commissioner’s Office. The Surveillance Camera Commissioner published the draft, so it was in the public domain, and civil society groups commented on it, as did the NPCC.

It is fair to say that the public expect the police to use technologies such as surveillance cameras to keep them safe. The Government and the police have a joint responsibility to ensure that they do so appropriately, while maintaining public trust—I think the noble Lord, Lord Anderson, talked about public trust. There are now real opportunities to make use of facial recognition to improve public safety. As has been mentioned, generations of police officers have used photographs of people to identify suspects and, more recently, CCTV images have been a vital tool in investigations. There are so many examples where suspect images have been matched to wanted known individuals, ensuring that they cannot evade justice when they cross force boundaries. What is changing is the ability to use computers to match images with increasing confidence and at speed, as well as combining technologies such as surveillance cameras and facial recognition to greater effect.

I shall mention a few examples. LFR trials have resulted in 70 arrests, including for suspected rape, robbery and violence, false imprisonment, breach of a non-molestation order and assault on the police. At a Cardiff concert there were no reported mobile phone thefts when South Wales Police used LFR, where similar concerts in other parts of the UK resulted in more than 220 thefts. South Wales Police produced around 100 identifications a month through retrospective facial recognition, thereby reducing identification time from 14 days to merely hours, which can be critical when dangerous individuals are at large.

Noble Lords talked about lawfulness and I will refer, as noble Lords have done, to the Bridges case. Mr Bridges claimed that his privacy rights were breached on two occasions when he passed in front of cameras during South Wales Police’s LFR trials. The Court of Appeal found several things: that there was a sufficient legislative framework to cover policing use of LFR but that the police had too much discretion on the “who” and “where” questions. The College of Policing national guidance is addressing those, to provide consistency. South Wales Police failed to take reasonable steps to establish whether the facial matching algorithms had the potential for bias, although the court found no evidence of bias.

The court also helpfully confirmed—which goes to the question of the noble Lord, Lord Rosser—that the police have common law powers to use LFR and, by implication, other novel technologies. The Data Protection Act is also relevant; for the purposes of operating “in accordance with law”, published police policies constitute law, and the use of LFR was proportionate. Whether something is proportionate is a judgment, not a simple mathematical calculation—for example, by multiplying the privacy impact on a person bringing a claim by the total number of people impacted. Also in answer to the noble Lord, Lord Rosser, LFR instantly deletes the biometrics of those who are not matched.

The noble Lord, Lord Clement-Jones, and others asked about the accuracy of LFR. The accuracy of any technique will depend on the technology and how it is used. Facial recognition systems are probabilistic; they suggest possible matches, not definite ones. The technology is becoming increasingly accurate. There will always be false alerts and—here is the crucial point—that is why a human being always takes the final decision to engage with an individual matched by the technology. On bias, it is very important that the police comply with the public sector equality duty to maintain public confidence. South Wales Police and the Met have found no evidence of bias in their algorithms and, as I said, a human operator always takes that final decision.

I should make the point that the US National Institute of Standards and Technology found that the NEC had

“provided an algorithm for which the false positive differential was undetectable”,

and that the algorithm

“is on many measures, the most accurate”

the NIST has evaluated. It was developed using the same technology and training data set as the algorithm used by South Wales Police and the Met. South Wales Police have since upgraded to an even more accurate algorithm. I hope I have demonstrated both the comprehensive legal framework and the common law powers.

The noble Lord, Lord Alton, made a very interesting point about systems that might be brought in and the ethical considerations their operation might raise. The FCDO and the Cabinet Office will issue new guidance to enable buyers to more effectively exercise their discretion to exclude suppliers linked with modern slavery and human rights violations. The public procurement Bill will further strengthen the ability of public sector bodies to disqualify suppliers from bidding for contracts where they have a history of misconduct, including forced labour and modern slavery. But I thank him for bringing the issue, which is of great concern to many people, to noble Lords’ attention.

The Human Rights Act, the Equality Act and the Data Protection Act are all parts of the consideration that the police must give when exercising their new powers. The code actually references all those pieces of legislation. The police are also subject to regulation, particularly through the Information Commissioner’s Office, a range of oversight bodies and other bodies providing guidance and support. We helped the police appoint a chief scientific adviser, and forces have access to further support from their own ethics committees, the Police Digital Service and the College of Policing.

We have been working with police to clarify the circumstances in which they can use live facial recognition and the categories of people they can look for. The College of Policing is planning to publish that national guidance in due course. It is an important part of our democratic process that people can raise and debate legitimate concerns about police use of new technologies, including in Parliament. I do welcome the opportunity, on the noble Lord’s Motion, for us to be able to do so today. The updated surveillance camera code references the Bridges judgment and requires the police to comply with it. The more detailed guidance the court called for will be set out in the College of Policing’s national guidance.

I will finish by saying the police do have a duty to make use of technologies to keep the public safe, and the Government are committed to empowering them to do so while maintaining public trust. Updating the code to reflect the latest legal position is just one of the steps we are taking to achieve those important aims.

Before the Minister sits down: is the issue of live facial recognition and its use by the police a matter for the police and crime commissioner to decide or for the chief constable to decide?

It would usually be a matter for local forces in the context in which they are deploying it. In terms of the seniority of the officer who can authorise it, I do not know, actually. I just know it is a matter for local forces to decide when and for what purpose they are using it. But I can write to the noble Lord about that.

I take it, since the noble Baroness made a reference to democracy and democratic accountability, that, at the very least, because the police and crime commissioner is elected and accountable, it must be a decision for the police and crime commissioner, rather than for a chief constable, who is not elected and not accountable in that way.

The PCCs clearly have oversight of what their police forces are doing, and I would be most surprised if the PCC was removed from that sort of operational context.

The noble Baroness was good enough to reference the statement from the FCDO. Would she be willing to take back to it the specific point I raised this evening about the company Hikvision, which is banned in the United States because of security, human rights and civil liberties concerns, and all the other things I said? I hope, therefore, that the noble Baroness will feel able to ask the FCDO why it has been banned in the US on the same intelligence we have, but not in the United Kingdom.

I referenced this without mentioning the company’s name. I recognise the seriousness of the issue and I will take the point back.

I have had a note to say that it is at constable level, but of course they are accountable to the PCC.

My Lords, I thank the Minister for her comprehensive reply. This has been a short but very focused debate and full of extraordinary experience from around the House. I am extremely grateful to noble Lords for coming and contributing to this debate in the expert way they have.

Some phrases rest in the mind. The noble Lord, Lord Alton, talked about live facial recognition being the tactic of authoritarian regimes, and there are several unanswered questions that he has raised about Hikvision in particular. The noble Lord, Lord Anderson, talked about the police needing a democratic licence to operate, which was also the thrust of what the noble Lord, Lord Rosser, has been raising. It was also very telling that the noble Lord, Lord Anderson, said the IPA code was much more comprehensive than this code. That is somewhat extraordinary, given the subject matter of the IPA code. The mantra of not stifling innovation seems to cut across every form of government regulation at the moment. The fact is that, quite often, certainty in regulation can actually boost innovation—I think that is completely lost on this Government.

The noble Baroness, Lady Falkner, talked about human rights being in a parlous state, and I appreciated her remarks—both in a personal capacity and as chair of the Equality and Human Rights Commission—about the public sector equality duty and what is required, and the fact that human rights need to be embedded in the regulation of live facial recognition.

Of course, not all speakers would go as far as I would in asking for a moratorium while we have a review. However, all speakers would go as far as I go in requiring a review. I thought the adumbration by the noble Lord, Lord Rosser, of the elements of a review of that kind was extremely useful.

The Minister spent some time extolling the technology—its accuracy and freedom from bias and so on—but in a sense that is a secondary issue. Of course it is important, but the underpinning of this by a proper legal framework is crucial. Telling us all to wait until we see the College of Policing guidance does not really seem satisfactory. The aspect underlying everything we have all said is that this is piecemeal—it is a patchwork of legislation. You take a little bit from equalities legislation, a little bit from the Data Protection Act, a little bit to come—we know not what—from the College of Policing guidance. None of that is satisfactory. Do we all just have to wait around until the next round of judicial review and the next case against the police demonstrate that the current framework is not adequate?

Of course I will not put this to a vote. This debate was to put down a marker—another marker. The Government cannot be in any doubt at all that there is considerable anxiety and concern about the use of this technology, but this seems to be the modus operandi of the Home Office: do the minimum as required by a court case, argue that it is entirely compliant when it is not and keep blundering on. This is obviously light relief for the Minister compared with the police Bill and the Nationality and Borders Bill, so I will not torture her any further. However, I hope she takes this back to the Home Office and that we come up with a much more satisfactory framework than we have currently.

Motion withdrawn.

House adjourned at 9.30 pm.