[Sir Roger Gale in the Chair]
I beg to move,
That this House has considered facial recognition and the biometrics strategy.
It is a pleasure to serve under your chairmanship, Sir Roger. First, I must declare my interests, which are not directly in the subject but in the privacy and data protection space in which I practise as a lawyer, as set out in the Register of Members’ Financial Interests. I chair various technology all-party parliamentary groups and Labour Digital. I am also a member of the Science and Technology Committee, which has an ongoing inquiry into the subject. We have taken evidence from Professor Paul Wiles, the Biometrics Commissioner, and Baroness Williams of Trafford, the Minister in the other place. Some hon. Members have sent their apologies, which I entirely understand, because we are competing with the climate change debate in the main Chamber.
Why did the subject first come to my attention? As a consumer, I have become increasingly used to using facial recognition technology, whether I have proactively agreed to it or not. I often forget my passwords these days, because I use my face to pay for things and open my iPad and phone, although as I was saying to my hon. Friend the Member for Sheffield, Heeley (Louise Haigh), that can prove tricky when I am trying to pay for things at a distance. For many of us, facial recognition technology provides consumer services on Facebook and Google by auto-tagging friends and family members and allowing us to search our images. There is an entire debate to be had about consent, transparency and privacy in the use of such technologies in the private sector, but my focus today is on the role of the state, the police and the security services in the use of facial recognition technology.
Facial recognition technology is beginning to be used more widely. It is well known to those who take an interest in it that the South Wales police has used it at sporting events; that the Metropolitan police has trialled auto-facial recognition technology on many occasions, including at events such as the Notting Hill carnival and Remembrance Sunday, and at transport hubs such as Stratford; and that Leicestershire police has used it at the Download music festival. I am concerned that it is also being used at public protests, although perhaps I understand why; I will come on to that later in relation to our freedom of association.
I congratulate my hon. Friend on securing this debate on a key subject. He has spoken light-heartedly about the competition with the climate change debate. Does he agree that in some ways, as with climate change, although only a small number of issues are currently associated with this topic, the range of impacts that facial recognition technology will have on our society and economy, on the way we work and do business, and on our trust relationships will be huge and will grow over time?
I agree wholeheartedly with my hon. Friend. She and I often end up in these types of debates in this place. One thing those debates have in common is that the technology is changing and the services are maturing at such a pace that regulation, and our response to the concerns, often lag behind. As legislators, we need to understand the technology as well as we can and make sure that the appropriate protections are in place.
In other spaces, we talk about the fact that I have a date of birth, I am male, I have two daughters and I am a vegan, which means that companies profile me and suggest that I might like to buy Quorn sausages that children like. There is a public debate about that, of course, but facial recognition technology is a particularly sensitive area of personal data. Such technology can be used without individuals really knowing it is happening, as I will come on to shortly, which is a big issue. It is not just police forces that are interested in the technology; some councils are using it to enforce certain rules, as is the private sector, as I say.
Facial recognition technology uses two methods: live auto-facial recognition, which is referred to as AFR Locate, and non-live auto-facial recognition, which is referred to as AFR Identify. What does Locate do? When such technologies are being trialled—although some police forces have been trialling such technologies for many years, so the definition of trial is important—cameras will build a biometric map of the face and facial features of members of the public who are walking down the high street, through a shopping centre or at a sporting or music event. That builds a numerical code that identifies them as individuals, which is matched against a database of images to identify them using the technology. That spurs an action by the police force or others, should they feel that that individual is high risk or has broken the law and some enforcement needs to be taken against them.
As I have alluded to, unlike fingerprints, which people have to proactively give, the technology is so pervasive that many people will walk past the cameras not really knowing that they are taking part in the process and, therefore, not consenting to it. As I will come on to shortly, the rules in place for the use of facial recognition technology are non-existent.
On non-live AFR, the so-called Identify scheme, I will focus on the databases that are being used. After we have built the facial image—the map or code of a person’s face—we match it against a database of images. What is that database and where do those images come from? The police have watch lists of people they are concerned about. Obviously, we want terror suspects to be on a watch list so that the police can do their job properly. There has been a question about scraping social media for images that police forces can match against. Can the Minister confirm that today? If we are doing that in an untargeted fashion, rather than only for those about whom there are legitimate concerns, we ought not to be. There are also custody images on databases such as the police national database, about which there are long-running concerns, as we have heard on my Select Committee. When the police take someone’s picture and put it on to the PND, it stays there. It does not matter whether they are convicted and go on to a list of people with convictions—perhaps we would understand if that were the case—or whether they are found innocent or no action is taken against them; their images are kept on the database anyway.
We have known for many years that the way the police have been processing the facial images of innocent citizens is unlawful. In 2012, in the case of RMC and FJ v. Commissioner of Police of the Metropolis, the High Court was clear that the retention of those images was being managed unlawfully. The Home Office responded, albeit some years later—I am not entirely sure why it took so long to respond to such an important issue—setting out a six-year review period in which the police would have to review the images on the database to decide whether they should weed out the images of innocent citizens. It also said that any of us could proactively ask the police force to remove our images because we claim our innocence.
There are several problems with that. Unsurprisingly, the number of requests to remove facial images from the database has been low, because people do not know that they can make them. The fact that people have to proactively prove their innocence in order not to be on a police database is also a fundamental issue. The problem is well known, however: the minutes from the September meeting of the Law Enforcement Facial Images and New Biometrics Oversight and Advisory Board say that
“most forces were struggling to comply”
with the Government’s response to the High Court’s ruling of unlawfulness. In answer to my questions in the Select Committee hearing, the Minister in the other place and her officials confirmed that no additional resource had been given to police forces to respond to or promote the fact that people can request the removal of their images from the database, or to undertake the review in which they are supposed to weed out and delete the images that they are not keeping on the database.
Evidently, the system is not fit for purpose. In my view, we continue to act in a way that the High Court said was unlawful, and I know that the Information Commissioner has also expressed concern. It will be useful if the Minister sets out how the Government will act to improve the situation, not only in terms of resourcing and support for police forces across the country but in terms of honouring the Government’s commitment to build new databases, so that the process can be automatic. Technology is pretty advanced now in some of these areas of facial recognition. If Facebook is able to identify me, tag me and take an action, and if Google is able to identify me and allow me to search for myself online, surely the Government ought to be able to auto-scan images, identify people who are not criminals and automatically delete the images of them. That ought to be deliverable. Indeed, it was our understanding that such a system was being delivered, but only a few weeks ago, when I asked Baroness Williams, the Minister in the House of Lords with responsibility for this issue, when we could expect the new computer system to be delivered, there was stony silence from the Minister and her officials. They were not clear when it was going to be delivered, why it had been indefinitely delayed and whether the delay was due to financing, contractual issues or technology issues. There was no clarity about how the existing system would be fixed.
We found in 2012 that the system was unlawful in relation to civil liberties. That in 2019, going into 2020, we still do not know how it will be fixed is wholly unsatisfactory. Will the Minister give us a clearer update today about when the automatic deletion service will be available to police forces?
I thank my hon. Friend for giving way to me again. He has made some very important points about the way in which this technology is already being used by Facebook and others, but is it not the case that, however advanced the technology is, it has also been found that it can be biased because of the training data that has been used, which means that particularly those from minorities or specific groups are not recognised adequately? Does he agree that it is all the more important that there is investment as well as transparency in the police database, so that we can ensure that groups who are already marginalised in many ways, particularly with regard to police services, are not once again being discriminated against?
Unsurprisingly, I agree entirely. This is part of a much broader conversation about designing technology with ethics at the very start, not only in facial recognition but in algorithmic decision making and a host of different areas where we have seen that human biases have been hardwired into automated decision processes that are delivered through technological solutions.
The Government have a really important role to play here, not just in setting the regulatory framework and building on, and really giving strength and resource to, the Centre for Data Ethics and Innovation to set the national and international tone, but through their procurement of services. They must say, “We have got to get this technology right. We are going to buy these systems, but we really must see this ethics by design right from the very beginning, dealing with biases in a way that allows us to avoid biased solutions.” That would stimulate the market to ensure that it delivered on that basis.
On the legal basis for biometrics, older forms of biometrics such as DNA and fingerprints have a legal framework around them; they have guidance and rules about how they can be used, stored and processed. There is no specific law relating to facial recognition and no specific policy from the Home Office on it. The police forces that are trialling these systems say that they are using existing legislation to give them the legal basis on which to perform those trials, but the fact of the matter is that we only need to look at the dates of that legislation to see that those laws were put in place way before the technology came into existence or before it reached the maturity that we are seeing today.
There was some debate during the passage of the Data Protection Act 2018, when I, my hon. Friend the Member for Sheffield, Heeley and others served on the Committee scrutinising that Bill, but there was no specific discussion during that process or any specific regulation arising from it about facial recognition technology. If police are relying on the Police and Criminal Evidence Act 1984—perhaps there is an irony in the date of that legislation—the basis and the understanding of the technology did not exist at that time, so it is not in that legislation. Even the Protection of Freedoms Act 2012 is too old. The definition of biometrics in that legislation cannot encapsulate a proper understanding of the use, sensitivity and application of automatic facial recognition.
I am not alone in saying this—indeed, it seems to be the view of everybody but the Government. The Information Commissioner has opened investigations; the independent biometrics and forensics ethics group for facial recognition, which advises the Home Office, agrees with me; the London Policing Ethics Panel agrees with me; the independent Biometrics Commissioner agrees with me; and, perhaps unsurprisingly, civil liberties groups such as Liberty and Big Brother Watch not only agree with me but are involved in legal action against various police forces to challenge the legal basis on which these biometrics trials are being conducted. When he responds, will the Minister say that the Government now agree with everybody else, or that they continue to disagree with everybody else and think that this situation is okay?
I will now address the second part of this debate, which is the biometrics strategy. I focused on facial recognition because it is a particularly timely and sensitive component of a broader biometrics strategy. All of us who use technology in our daily lives know that biometric markers and data can be used to identify our location, identity and communications. That means that the Government and, indeed, the private sector can access data and learn things about us, and that area of technology is growing. People are rightly concerned about ensuring that the right checks and balances are in place. It is one thing for an individual to agree to facial recognition technology in order to unlock their tablet or phone, having read, I hope, about what happens to their data. It is another thing, however, for them not to be given the opportunity to give their consent, or not to receive a service and therefore not know about it, when the state is using the same types of technology.
The biometrics strategy needs to get into the detail. It needs to set out not only what is happening now but what is envisaged will happen in the future and what the Government plan to do about it, in order to protect civil liberties and inform citizens about how the data is being used. Clearly, they would not be informed individually—there is no point in telling a terrorist planning an incident that there will be a camera—but the right balance can be achieved.
Again, I do not understand why the Government are so slow in responding to these fundamental issues. It is so long since the 2012 High Court ruling on the retention of custody images, and we had to wait five years for the biometrics strategy. Imagine how much the biometrics sector in this country changed during those five years. Perhaps the Government were trying to keep up with the pace of change in the technology space, but the strategy was long delayed and long awaited.
Given my tone, Sir Roger, you will not be surprised to hear that everyone was very disappointed with the biometrics strategy, because it merely gave a kind of literature review of current uses of biometric data. There was a little bit about the plans for a new platform, which the Home Office is building, regarding how different people access biometric data. It said nothing at all, however, about the future use, collection and storage of biometric data, or about data protection. It said nothing about the Government’s own use and collection of data; the need for enforceable guidelines to enable devolved decision making by, for instance, police forces across the country; how different Departments might be able to use different forms of biometric data across Government, which, evidently, is very easy to deliver with today’s technology; or how the data would be stored securely.
People are concerned about cyber-security and breaches of their personal data, so what steps will the Government take in this developing space? Where will the data be stored? In advance of this debate, I received representations arguing that we should not send it to companies overseas and that it should be stored in the UK. One would think that the biometrics strategy addressed those issues, but it does not. Is the beta version of the biometrics strategy due soon, or does the Minister think that the Government have provided a sufficient response on this important field?
I do not want to keep saying that everybody agrees with me, because that would be a little uncomfortable, but there is no denying that the Biometrics Commissioner, the Surveillance Camera Commissioner and the Information Commissioner’s Office have all said exactly the same thing—this biometrics strategy is not fit for purpose and needs to be done again. The Government need to be clearer and more transparent about their endeavours and make that clear to the public, not least because these areas of technology move at pace. I understand entirely why police forces, civil servants or others want to be able to take the opportunities to deliver their services more efficiently, more effectively and with more impact—we support that—but the right checks and balances must be in place.
I will touch on our fundamental rights and freedoms, because that debate does not get enough air time in the technology space. Our freedoms are increasingly being challenged, whether the issue is cyber-defence or how we regulate the online world, and also in this space. Fundamental freedoms—freedoms that we hold, or purport to hold, dear—are encapsulated in the European convention on human rights and the Human Rights Act 1998. They go to the very nature of this technology, such as the right to a private life that can only be interfered with for a legitimate aim and only if that interference is done proportionately. Scanning a load of people going about their day-to-day life does not feel proportionate to me, and there is no accountability to make sure that it is being done legitimately. As my hon. Friend the Member for Newcastle upon Tyne Central (Chi Onwurah) said, if the selections that those technologies pick up are resulting in false matches or are discriminating, primarily against women and people from ethnic minority backgrounds, that also ought to be considered.
Those freedoms also include freedom of expression and of association. In public protests in recent weeks, people who dearly hold certain views have gone too far by moving away from their right to freedom of expression and to peaceful demonstration, towards criminal activity, intimidation or hostility. We should set the tone and say that that is not welcome or acceptable in our country, because having a right also means having a responsibility to use it wisely. Of course we want to protect those who want to demonstrate through peaceful public protests.
I am sure the public will say—this lies at the heart of my contribution—“Fine. Use some of this technology to keep us safe, but what is the right balance? Do we understand how it is being used? What are the accountability measures? What rules and guidance are being put down by the Government, on behalf of Parliament and the people, to make sure this is being done in a way that is not a slippery slope towards something we ought not to be doing?” We need a wider debate in public life about how we protect freedoms in this new digital age, and this issue is an example of that.
The House of Commons digital engagement programme is often a very good process for Westminster Hall debates, as it allows the public to be part of the conversation and to submit their comments. It would be remiss of me not to point out that some members of the public highlighted a certain irony in the fact that this debate was being promoted on Facebook, so I have shared their concerns, but that is still a medium through which the public like to engage in debate. Hundreds of thousands of people engaged across different platforms—way more than I was expecting—which shows the level of public interest in the use of these technologies.
As might be expected, there were two sides to the argument. The minority view on the platforms was, “I have nothing to hide. Please go out and keep us safe. Crack on, use it.” The other side said, “Actually, this is a slippery slope. I don’t know how this is used, and I’m worried about it. Why can’t I go about my day-to-day life without the police or the state surveilling me?”
I will share some of the comments. On the first side of the argument was Roy. I do not know where he is from. I wish his location had been given, because I could have said, “Roy from Sheffield”. He said:
“No objection. I’ve nothing to hide and don’t find it scary or objectionable for ‘the state’ to be able to track my movements. They already can if I’m in a car”—
I did not know that—
“and that doesn’t seem to be a problem. The added security of the police being able to track potential terrorists far outweighs any quibbles about reduced privacy.”
That is a perfectly legitimate view.
Another member of the public said:
“Having seen the numbers of crimes solved and even prevented by CCTV I have no objections. Today we have to be realistic, with phones listening in on conversations for marketing and plotting where we are, this is small price to pay for public safety and if you have done nothing there is nothing to fear.”
That is an interesting contribution on what is happening in the private and state sectors. We need to be much more advanced in both spheres.
That was a minority view, however. I do not have the percentage, but the bulk of comments came from people who are concerned. Chris Wylie, who many of us will have read about—he was the Cambridge Analytica whistleblower, so he clearly knows something about these issues—was firm:
“No. Normalising this kind of indiscriminate surveillance undermines the presumption of innocence.”
We should pause on that, because it is really important. Why should we be tracked and surveilled by the police on the assumption that we might be guilty of something? That does not feel right, just as it does not feel right that people have to prove their innocence to get their images taken off a police database. Chris went on to say:
“It should never be up to us as citizens to prove we are not criminals. Police should only interfere with our lives where they have a reasonable suspicion and just cause to do so.”
I share Chris’s views.
Andrea said that this was a slippery slope:
“The idea that some people have about privacy as an exclusive issue for the bad guys is completely wrong. Not only privacy prevents my acts from limiting my rights but also avoids an unjustified use of power by the Gov’t.”
Again, we should pause there. It is our job in Parliament to hold the Government to account, yet we have no strategy, legislation or rules to enable us to do so. That is a fundamental problem. She goes on to say:
“Such a huge involvement of disturbing tech could lead to a 1984-like slippery slope, one which none of us wants to fall in, regardless of their legal background.”
Another contributor said:
“I believe that this would suppress people’s ability to engage in public demonstrations and activities that challenge the government, which is hugely dangerous to democracy.”
A lot of people said that if they thought the state was scanning their data and putting it on a database, they might not associate with or take part in public demonstrations. If that were to happen, it would represent a significant diminution of our democratic processes.
Lastly, Bob said:
“It makes it easier for a future, less liberal government to monitor the activity of dissident citizens. During the miners strike in the 1980s miners were stopped from travelling just on the suspicion they would attend rallies based on their home locations and where they were heading. How would this technology be applied in the future for, say, an extinction rebellion march?”
Regardless of our political disagreements across the House, none of us thinks that the state is overreaching in a way that many other countries would. However, given the lack of legislation, guidance and regulation to enable us to hold the Government to account, and with independent commissioners and regulators saying that this is not good enough, I agree with Bob. There is a huge risk in not putting in place a framework with the appropriate checks, balances and protections, not just because that is the right and important thing to do today, but because we need that framework for future Governments.
My hon. Friend is being very generous with his time, and I congratulate him again on having raised this important topic. Does he agree, as I think he is suggesting, that the level of interest in this debate—demonstrated by the quotes he has read out—shows that technology such as facial recognition, as well as algorithms and data, needs to be publicly debated? We can make a choice as to how it is used, so that citizens are empowered. Technology should not be something that is done to people; they should have rights and controls as to how it is enacted.
My hon. Friend is absolutely right. The debate is a broader one about technology. How do we engage the public with these issues? I am an evangelist for technological reform, although I will not go on about that topic for too long, because it is not linked to the title of the debate. In my view, the idea that we can increase our economy’s productivity, increase wages, transform people’s working lives and reform and make more efficient our public services without using technology does not make sense. As my hon. Friend says, however, we have to do that in the right way and bring the public with us.
On a cross-party basis, we share the belief that we need to take crime seriously, and to address the increasingly sophisticated methods that criminals and terrorists may employ when trying to commit crimes or terror in our country. However, we must get the balance right, and there is a lacuna of regulation in this space. There are no legal bases, there is no oversight, and as a consequence there are no protections. That is why the Government should act now.
I congratulate the hon. Member for Bristol North West (Darren Jones) on presenting the case very well. We spoke before the debate started and found we were on the same page. I am pleased to see the Minister in his place. Our comments are made with a genuine interest in arguing the case and hopefully to help the Government see the correct way of moving forward. I also want to thank the hon. Member for Bristol North West for the hard work that he and other members of the Science and Technology Committee undertake. It is painstaking work—very technical and detailed. If I was wearing a hat I would take my hat off to them for what they have done.
The issue of facial recognition is a complex matter. Of course, anyone who watches American crime dramas—I am one of those people who watches CSI and all the others from about 11 o’clock until 12 midnight before going to bed—will think it is a useful tool for identifying terrorist suspects, and it can be, but Hollywood life and real life are two very different things. It is difficult to draw a black-and-white line when we consider people’s right to privacy and how far we can have a security state without a Big Brother state. I am always very conscious of that Big Brother state.
I thank the Library for the background information. I read in the paper this morning of a suspect in China who was wanted in relation to the murder of his mother. He had been missing for two to three years, but facial recognition was installed at the airport and they caught him. That is one of the good things that can happen—those who thought they would get away with a crime are made accountable.
I declare an interest as the chair of the all-party group for international freedom of religion or belief. As hon. Members know, I am very interested in such issues. China has apprehended a fugitive and is making him accountable for his crime, but at the same time China uses facial recognition to fence in villagers in the far west of China. That is a very clear illustration of how that technology can be used to the detriment of human rights and religious minorities in a country that, let us be honest, is guilty of many human rights abuses. I am very concerned at how China can use facial recognition to its advantage to suppress human rights and to suppress people’s right to practise their religion in the way that they would like.
On accuracy and bias, some of the information illustrates clearly that error rates for low-resolution surveillance footage can exceed 10%, so there is still a question mark over the process. If as many as 10% of people are matched to the wrong person, I question the validity of the process.
We cannot deny or ignore the concerns of Elizabeth Denham, the Information Commissioner. She raised concerns about facial recognition technology and law enforcement in her blog:
“There is a lack of transparency about its use and there is a real risk that the public safety benefits derived from the use of FRT will not be gained if public trust is not addressed.”
I refer to the questions that the hon. Member for Bristol North West asked, and I suspect others will, in relation to the need for public trust that the system will not be abused or used detrimentally against people. People feel strongly about this matter. How does the use of FRT comply with the law? How effective and accurate is the technology? How do forces guard against bias? What protections are there for people who are of no interest to the police? How do the systems guard against false positives and their negative impact? That is clearly an issue.
My hon. Friend the Member for South Antrim (Paul Girvan) tabled a parliamentary question on 24 May 2018—
“To ask the Secretary of State for the Home Department if he will take steps to ensure that the facial recognition software that law enforcement bodies use is accurate.”
That question clearly tells us that there are concerns across all four regions of the United Kingdom—England, Scotland, Northern Ireland and, obviously, Wales.
The background is clear. The courts ruled in the 2012 RMC case that it was unlawful to hold custody images without making a distinction between those who were convicted and those who were not. In response, the Home Office has introduced a system to allow unconvicted individuals to request the deletion of their images. We understand the system and that is all great, but it is an opt-out scenario; the individual must ask for the deletion of their image. I am not sure how many people would think of doing so; I suspect it would be the last thing on many people’s minds, with their busy lives. I know I probably would not think of doing so. I would not know that my images had been stored away for a rainy day to be pulled out, even though I am completely innocent. The presumption, “You may well do something someday”, is not enough of a reason to hold on to these things. An arrest must be made for fingerprints to be taken and stored, and yet no arrest is needed for images of a person in the background of an event to be taken and perpetually stored by successive Governments—not just this Government, but every Government that comes after, if the legislation is in place.
The excuse of cost is a weighty consideration, and so is the protection of personal identification. I say this honestly: because of my age I have lived through the height of the Troubles, when cars were searched, ID was a must and the battle against terrorists was daily. I lived with that, not just as an elected representative, but as a former member of the part-time army—the Territorials and the Ulster Defence Regiment. We seem to be heading that way again. I could understand it if the Government were to make it known that they believed that retaining this process would save lives—I would understand the thinking behind what they are trying to do—but that if necessary, there would be a mechanism to have the information removed. I could understand it if there was that level of transparency. However, to say that the reason is that there is not enough money to do an IT upgrade just does not wash with me, and I suspect it does not wash with others taking part in today’s debate.
I agree with the Science and Technology Committee report, “Biometrics strategy and forensic services”, published on 25 May 2018, which states:
“The Government must ensure that its planned IT upgrade…is delivered without delay…to introduce a fully automatic image deletion system for those who are not convicted. If there is any delay in introducing such a system, the Government should move to introduce a manually-processed comprehensive deletion system as a matter of urgency.”
That would be my request to the Minister. We have great respect for the Minister; that goes without saying, but we are very concerned about what is proposed, or could possibly happen, and we have to record those concerns.
I further agree that facial image recognition provides a powerful evolving technology, which could significantly help policing. I am all for that, but everyone must be clear about what is acceptable, what is not acceptable, what is held and for what purpose. That underlines my point that if it is for the sake of security, then say it and we can debate the merits of the case. If that is the purpose, let us debate it honestly, truthfully and in an informed way, to ensure that all concerns are taken on board.
I am all for targeting those on a watchlist or those affiliated with those on a watchlist, as in previous examples of terrorism on the mainland and back home as well, but let us be clear that it is happening, and let us be clear that those who take umbrage against it have the information that they need to ensure that their images are not stored even though they have not committed a crime and are not a person of interest. I am conscious of the need to protect human rights, protect privacy and protect those who are innocent, although I understand the reasons for the process.
In conclusion, I look to you, Minister, as I always do. We must have a chance to debate these issues and make an informed decision about the strategy and the justification for it. I look forward to the report’s coming before us, but I must record my concerns.
It is a pleasure to participate in this debate under your chairmanship, Sir Roger. I congratulate my hon. Friend the Member for Bristol North West (Darren Jones) on securing it. My speech will be neither as lengthy nor as expert as his. My interest in this matter arises from the issue in my constituency last year when a report in the Manchester Evening News revealed that the intu Trafford Centre had been working with Greater Manchester police to use live facial recognition technology. I had not been made aware of that previously, and as far as I know, none of my constituents, or the other members of the public, knew of it either. Following the report in the Manchester Evening News, the intu Trafford Centre and Greater Manchester police suspended the pilot.
Like my colleagues, I suspect that many of our constituents would support the use of facial recognition and other technologies to prevent crime, keep us safe, catch criminals or trace missing and vulnerable people, which is something that I understand the British Transport police are considering. However, as we have heard, the use of the technology raises a number of issues, which my hon. Friend the Member for Bristol North West drew attention to. I have discussed some of them directly with local police in Greater Manchester, and at national level. I am grateful to the police officers who have spoken to me for their openness in those discussions. It is clear that the police themselves feel that there is a pressing need for the national framework that would make effective use of the technology possible. For now, they do not feel they have that.
From my perspective, and in the light of the experience in my constituency, I think that the framework will need to address decision making, who takes a decision to use such technology in a particular context, oversight and, importantly, accountability. How can such use be scrutinised and how can the police and other state authorities be made accountable to the public? I say that because what is happening could constitute a significant intrusion into the privacy of individual citizens whose record contains nothing criminal or threatening, and who are merely going about their daily business. It is important that the use of the technology in relation to the majority of citizens should be both appropriate and proportionate.
Issues that concern me include the size and content of any watchlist that might be constructed—particularly vis-à-vis the effectiveness of the size of the watchlist. In the Manchester Evening News report it was revealed that 30 million people per annum visit the intu Trafford Centre. It is an iconic destination in my constituency. However, over the six-month period during which the technology was being deployed, only one positive identification was made. That makes me question whether it was right to draw so many members of the public into the ambit of the experiment, for what seems to be a low level of effectiveness.
We also have to consider where the technology is being used. The police themselves said to me that some events or venues will be more suitable, and some less. We also need to consider why it is used—at whose initiative or request such technology is deployed. In the Trafford Centre the intu management themselves had suggested it to Greater Manchester police. Is it right that police priorities should be set by the wishes of private enterprises? If that can be right, and in some circumstances there can be a partnership approach between the police and private entities, if the private entity draws a disproportionate benefit from the activity is it not right that it should pay for it? Football clubs pay for additional police protection at football matches.
We have heard concerns about potential ethnic bias in the databases and technologies that are currently available. I am told that what is on the market, as it were, at the moment is better at matching north European and south-east Asian males than other ethnic categories. That relates to the countries in which the algorithms that underpin the technology were developed, but from the public’s point of view we can say that if there is any ethnic disparity, or perception of it, in the way the technologies apply, it is bound to sow public mistrust. It cannot be right that we make use of technologies that do not treat all communities equally.
I have mentioned my concerns about where decisions are taken in police and other public agencies. It has been made clear by regulators that that should be at the most senior strategic level, and in my view it should be in the context of an absolutely transparent national framework. I also think we must think about mechanisms for accountability both to individual members of the public and the community that a police force serves overall.
Finally, while we are not going to halt the speed of spread of technology, and I think that we can expect more resources to go into such technology in the future, there is a question about how we prioritise resources vis-à-vis effectiveness and public buy-in. The static facial recognition technologies that have been used have excited much less contention and public concern. People can understand that the police hold a database of those with previous convictions and criminal records, and that where they have someone they are unable to identify, they will check against those records. I understand that that database is in need of new investment, which it is not currently scheduled to receive. I ask the Minister whether that might be the first priority for investment in facial recognition technologies; can the investment that is needed in the police national database be brought forward?
I am glad we have had the opportunity to debate the matter in Parliament today. I would be misleading the House if I suggested that it was causing widespread concern among my constituents, but in fact it should be. How the technology is being used, and the context in which we are made aware of its use, should concern us all. That is not to say it should not be used, but in the absence of a clear legislative or regulatory framework for its use, I do not think it would be right for the House not to ask those questions today.
It is a pleasure to serve under your chairmanship, Sir Roger. I thank the hon. Member for Bristol North West (Darren Jones) for obtaining the debate. I can testify to his expertise on such issues, having served with him on the Committee that scrutinised the Data Protection Act 2018. I claim no such expertise, so I am grateful to him for succinctly explaining the operation of facial recognition technology in particular. It has been a useful debate. It is a shame that we have clashed with the climate change debate because, as the hon. Member for Stretford and Urmston (Kate Green) said, even if the issue does not cause concern among many of our constituents at the moment, it ought to. There are some important questions that we have to debate and address.
The use of biometrics by police and law enforcement is of course not remotely new, but it is clearly evolving exponentially. It can and does make a huge contribution to detecting and preventing crime; it also has an important role in eliminating innocent individuals from inquiries, and it can help to trace missing and vulnerable people, but as all the hon. Members who spoke highlighted, it poses a range of serious ethical and human rights questions. It has the potential to be hugely invasive of privacy, largely because the systems may operate with little or no consent from those caught up in them; there could be impacts on freedom of assembly and association, and the operation of the systems raises significant questions about data protection. With many forms of fast-developing technology, it is a challenge for the legal system to keep pace with changing use. Understandably, there has been particular concern about automatic facial recognition technology.
All the different legal systems in the United Kingdom and beyond face those challenges, and of course Scotland is no different. We kick-started our debate on the issues in 2016 with the report of Her Majesty’s inspectorate of constabulary in Scotland. It concluded that Police Scotland had been making
“proportionate and necessary use of Facial Search within PND”
and that it had been operating in accordance with its own policy, Home Office policy, and College of Policing guidance. However, it identified similar concerns to those that have been raised in the debate, and the need for improved legislation, a statutory code of conduct to govern Police Scotland’s use of biometric data, and better independent oversight.
The main legislation relating to biometrics in Scotland dates from 2010. The hon. Member for Bristol North West mentioned 2012 legislation being out of date already, and I absolutely accept that the 2010 measure is now too old. It predates the time when Police Scotland started to upload photos on to the police national database, in 2011. I understand that the facial search functionality of PND became generally available in March 2014. We do indeed have some catching up to do to make sure that issues to do with images and facial recognition technology are properly covered in legislation.
Following the inspectorate report, the Independent Advisory Group on the Use of Biometric Data in Scotland was established to produce more detailed proposals for plugging some of the gaps and setting up a more ethical and human rights-based framework. I thoroughly recommend the group’s report—it is a fascinating read. It draws on a range of expertise, not just from members of the group, but from the police, human rights and data protection groups, and experts such as the Biometrics Commissioner, the Forensic Science Regulator and the Biometrics and Forensics Ethics Group, which advises the Home Office. The report found that
“those involved in this field in Police Scotland...appear to work to very high standards of international repute, with a good grasp of the ethical and human rights implications of their work”.
It also made several recommendations about enhancing the legislative framework and oversight. Specifically, it recommended a Scottish biometrics commissioner and an ethics advisory group. It recommended a new legislative framework, accompanied by a code of practice, and made more detailed policy recommendations that I will come to shortly. I am pleased that those recommendations have been accepted by the Scottish Government. A public consultation has been held, and a biometric data Bill will soon be introduced to implement them. That is the right approach, and hopefully it will deliver the comprehensive framework that hon. Members have argued for today.
Let me turn to two of the most controversial aspects of the debate. In Scotland, 2010 legislation allows Police Scotland to retain fingerprints and DNA data from convicted individuals indefinitely. Data from individuals prosecuted for certain sexual and violent offences may be retained for three years, regardless of whether there is a conviction, and the chief constable can apply to the sheriff court for a two-year extension. More generally, data from individuals who have been arrested for an offence must be destroyed if they are not convicted or if they are granted an absolute discharge. Usual practice for photographs also follows that regime, which is slightly different from what happens in England and Wales, particularly with regard to the disposal of photographs of those who have not been charged or convicted.
Is that the perfect approach? I do not think we can answer that conclusively; we must be led by the evidence as it develops. It is perfectly legitimate to question whether a blanket policy of the indefinite retention of biometrics after every conviction is reasonable, because, as the advisory group pointed out, there is no abundance of evidence to suggest what degree of retention has proved the most useful. Biometric data is likely to be more useful in identifying the perpetrators of some crimes than others, but the risk of offending and reoffending involves a range of factors, including many individual aspects. In an ideal world, the length of time we kept biometric data would be decided for each individual case, but that is not a practical approach, so we must consider the evidence gathered and do the best we can.
The use of automated facial recognition systems is hugely problematic, and our general approach must be evidence led. If such technology is to be used, it must be used only where necessary, and it must be done proportionately and on as targeted and limited a basis as possible. There are huge concerns about the impact of such technology on privacy and freedoms such as the freedom of assembly, and there is a danger of bias and discrimination. Studies have shown that such technology can disproportionately misidentify women and black and minority ethnic people, and as a consequence people from those groups are more likely to be wrongly stopped and questioned.
We must by now have sufficient evidence from forces in London and south Wales to show what automated recognition could look like in practice, what it is capable of achieving, and the price to be paid for that achievement. I will not say that I envisage no circumstances where the use of such technology could be justified—for example, it could be used to counter a specific and serious threat or danger—and I am probably somewhere between Roy and Chris in the range of views set out earlier. Nevertheless, I would be reluctant to see such technology rolled out in Scotland before the new regulatory and oversight regime is in force and before issues of bias and discrimination have been addressed. It seems sensible to stop the use of the technology elsewhere until its implications have been fully assessed and debated, sufficient checks are in place, and there is sufficient public support.
I will end with a quote from the advisory group:
“In this context, it is essential that sensitive personal data are collected only for specific, explicit, lawful and legitimate purposes. In seeking to achieve a careful balance between the needs of citizen and state, there is clearly a need for independent oversight, and for the development of a broad framework of consistent ethical and human rights respecting principles against which all biometric use for policing, law enforcement and public protection purposes in Scotland can ultimately be checked”.
The SNP supports an approach that involves a comprehensive legislative framework and a regularly updated code of conduct. We need strong oversight through a commissioner to ensure that the use of biometrics is proportionate, necessary and targeted, and respects human rights, privacy and data protection rules. I congratulate the hon. Member for Bristol North West on securing this debate. I hope there will be many more to come, with more MPs in attendance, as this important subject requires much more discussion.
It is a pleasure to serve under your chairmanship, Sir Roger. This excellent discussion has been informed by expert opinion, particularly from my hon. Friend the Member for Bristol North West (Darren Jones), whom I congratulate on securing this important debate. I think the public would be shocked to hear about the lack of a legislative framework and guidance, and the potential for such intrusion into people’s lives by the state.
My hon. Friend spoke about the need for us all to understand the technology that could be used, and to ensure that the frameworks we set out are relevant and keep pace with technological change. That must be informed by a principles-based framework, because legislation will never keep up with the technology used by law enforcement or private operators. Several Members mentioned the police national database and the unlawful processing of custody images by police forces. That is not a good starting point for this debate, given that the Home Office’s response to that issue has been poor and the delays in the auto-deletion of images are worrying.
My hon. Friend mentioned the need for ethics by design to ensure that any biases, particularly against people from BME backgrounds, are built out of such technologies from the beginning and are not allowed to be replicated and harden. He described well the astonishing fact that there is no legal basis for these invasive, pervasive technologies and highlighted clear gaps in the biometric strategy in failing to address those issues. The hon. Member for Strangford (Jim Shannon) spoke powerfully about the consequences of false positives, and raised basic questions about the rights of innocent people. Those questions should be answered. We should not need to hold this debate to speak about the right of innocent people not to have their privacy undermined, and about the police unlawfully holding images of people who have committed no crime.
My hon. Friend the Member for Stretford and Urmston (Kate Green) spoke about her personal experience and the Trafford Centre in her constituency. She made the important point—I have had the same conversation—that the police want and need a transparent, national and consistent framework, because they feel that they have to make things up as they go along. Experiences will differ: South Wales police has demonstrated a completely different attitude to the Met in rolling out facial recognition, and it cannot be right for people to experience different technologies in completely different ways and with different attitudes, depending on the police force in the area where they live.
My hon. Friend is right to say that the police want a clear, national framework, and it cannot be right that different police forces operate in different ways. Greater Manchester police has stopped using that technology altogether, but there may be circumstances where we would like it to be deployed to keep us safe.
That is completely right, and that is why this debate and the framework are so important. We cannot allow the police, with all the best intentions, to attempt to use this technology and then in some cases to mess it up—as they will—and have to roll it back. We want to ensure that the framework is in place so that the police can go ahead with confidence and the public have confidence. We must ensure that biases are designed out and that people accept the intrusion into their privacy and understand that such technology is being used proportionately and out of necessity. At the moment we cannot have confidence in that, which is why this debate is so important.
I thank my hon. Friend for giving way, not least because I spoke at great length today. I did not mention earlier that we took evidence in the Select Committee from the Biometrics Commissioner that trials should be conducted on the basis of rigorous scientific guidelines and processes. The problem is that if we let different police forces do different things in different ways, we do not get clear answers on how and in what circumstances the technology can best be used. We need guidelines not just for the regulatory purposes, but so that the trials can be done in the right way.
That is absolutely right. I do not get a strong impression that individual police forces are learning from each other either. In the case of the Met, the word “trial” has been used for the technology’s use at Notting Hill carnival. It has been trialled for three years in a row. When does a trial become a permanent fixture? I do not think that that can now be called a trial. My hon. Friend is absolutely right that if it is a trial, we should be gathering data, and they should be informing Parliament and the public and should be addressing the concerns around false positives and ethnic biases and whether it is being used proportionately. My hon. Friend the Member for Stretford and Urmston gave the astonishing figure that demonstrated the mismatch between the numbers of people who were covered by the facial recognition technology when just one individual was identified. That surely cannot be proportionate.
The question of technology within law enforcement gets to the heart of public consent for policing in this day and age, and the issues we have discussed today represent only the tip of the iceberg of potential privacy issues. So much of what defines an investigation today is data-driven. Data-driven policing and data-led investigations are transforming policing. It is already completely unrecognisable from when I was a special constable only 10 years ago. The police have the scope to access more of the intimate details of our personal lives than ever before.
The trialling of technology—including facial recognition and, as my hon. Friend the Member for Bristol North West mentioned, risk assessment algorithms—has not been adequately considered by Parliament and does not sit easily within the current legal framework, but it is having some phenomenal results that we should not ignore. The identification of images of child sexual abuse relies on hashing technology, which enables law enforcement and the Internet Watch Foundation to scrape hundreds of thousands of images off the internet each year.
This week, we have had the news on what is in essence compulsion for rape victims to hand over their mobile phones for what potentially amounts to an open-ended trawl of data and messages, without which there is little prospect of conviction. That high-profile debate has lifted the lid on the ethical questions that the ubiquity of data and technological advances pose for law enforcement. Nascent technologies such as facial recognition are at the sharp end of this debate. They do not just represent challenges around the collecting and storing of data; they also provide recommendations to law enforcement agencies to act, to stop and search and, potentially, to detain and arrest people.
As my hon. Friend the Member for Bristol North West said, we served on the Data Protection Bill Committee, where we discussed these matters briefly. We outlined our concerns about facial recognition, in particular the lack of oversight and regulatory architecture and the lack of operational transparency. I reiterate the call I made to the Home Secretary in May last year that Her Majesty’s inspectorate of constabulary launch a thematic review of the operational use of the technology and report back to the Home Office and to Parliament.
We believe such a report should cover six key areas: first, the process police forces should and do follow to put facial recognition tools in place; secondly, the operational use of the technology at force level, taking into account specific considerations around how data is retained and stored, regulated, monitored and overseen in practice, how it is deleted, and its effectiveness in achieving operational objectives; thirdly, the proportionality of the technology’s use to the problems it is seeking to solve; fourthly, the level and rank required for sign-off; fifthly, the engagement with the public and an explanation of the technology’s use; and sixthly, the use of technology by authorities and operators other than the police.
It is critical, as operational technology such as this is rolled out, that the public are kept informed, that they understand how and why it is being used and that they have confidence that it is effective. The Minister has the power to commission reports of this type from HMIC, and HMIC would be best placed to conduct such a report into the use of police technology of some public concern.
We have discussed concerns about the accuracy of facial recognition tools, particularly in relation to recognising women and people from BME backgrounds—that is quite a swathe of the population! We do not know whether this is because of bias coded into the software by programmers, or because of under-representation of people from BME backgrounds and women in the training datasets. Either way, the technology that the police are currently using in this country has not been tested against such biases. In the debate around consent, it is extremely worrying that potentially inaccurate tools could be used in certain communities and damage the relationship with and the trust in the police still further.
As I said, we had some debates on this issue in the Data Protection Bill Committee, where we attempted to strengthen the legislation on privacy impact assessments. It should be clear, and I do not believe that it is, that police forces should be required to consult the Information Commissioner and conduct a full PIA before using any facial recognition tools.
I am further worried that the responsibility for oversight is far from clear. As we have heard, software has been trialled by the Met, the South Wales police force and other police forces across the country, particularly in policing large events. In September last year, the Minister made it clear in response to a written question that there is no legislation regulating the use of CCTV cameras with facial recognition. The Protection of Freedoms Act 2012 introduced the regulation of overt public space surveillance cameras, and as a result the surveillance camera code of practice was issued by the Secretary of State in 2013. However, there is no reference to facial recognition in the Act, even though it provides the statutory basis for public space surveillance cameras. The Surveillance Camera Commissioner has noted that “clarity regarding regulatory responsibility” for such facial recognition software is “an emerging issue”. We need clarity on whether it is the Biometrics Commissioner, the Information Commissioner or the Surveillance Camera Commissioner who has ultimate responsibility for this use of technology. It would also be helpful if the Minister made absolutely clear what databases law enforcement agencies are matching faces against, what purposes the technology can and cannot be used for, what images are captured and stored, who can access those images and how long they are stored for.
The Government’s new biometric strategy takes a small step forward on oversight, with a board to evaluate the technology and review its findings, but it meets too infrequently—three times since last July, as far as I can tell—to have effective oversight of the operational use of the technology. In any case, it is clearly not designed to provide operational safeguards, and that is where big questions remain about discriminatory use and effectiveness. The lack of operational safeguards and parliamentary scrutiny may lead to ill-judged uses of the technology.
I am hopeful that the Minister can assure us today of the Government’s intention to make things a lot clearer in this space, that existing and emerging technologies will be covered by clear, consistent guidance and legislation from the Home Office, that the relevant commissioner will have all the powers they need to regulate these technologies, and that our law enforcement agencies fully understand what they need to do, both before any technology or new method of data collection is rolled out, and afterwards, when an individual’s data rights may have been abused. We need clear principles, and I am not convinced that the legislative landscape as it stands provides that.
It is a great pleasure to serve under your chairmanship, Sir Roger. It was a wrench to come out of the climate change debate in the Chamber, but the debate here has shown that what we are discussing is extremely important. Before I start, I recognise the presence of the Chair of the Science and Technology Committee, the right hon. Member for North Norfolk (Norman Lamb), who has joined us. I will of course take an intervention if he wishes to speak.
I congratulate the hon. Member for Bristol North West (Darren Jones) on securing the debate and on his excellent speech, which was rooted in genuine passion, deep expertise and a lawyer’s ability to present a case and fillet the evidence. It was really interesting. Of course, the context, which the hon. Gentleman was very good at laying out, is huge. We will talk about the police and the attitude of the security services, but ultimately this is a debate about how we protect our personal freedoms in the digital age, to use the hon. Gentleman’s language, and that is an enormous issue. Some hon. Members have already volunteered the opinion that the public are not yet fully engaged with the issue, and I support that from the experience of my constituency, but it is a huge issue.
The other context that we have alluded to and must not lose sight of is the backdrop of the extraordinary acceleration of the pace of change in what technology now enables for good and evil. Therefore, the debate about how far we go in supporting our police system and our security system—those who get up every morning thinking about how they can protect us—in using technology for the power of good is extremely important.
The hon. Gentleman mentioned a fundamental issue that underpins the debate. His primary charge against the Government, which was echoed by others, was that the regulatory framework, the legal framework, the oversight arrangements and the governance framework were not fit for purpose. He also said that a fundamental challenge for any Government of any colour is finding ways to keep pace with what is going on out there and make sure that the checks and balances and protections and regulations that are put in place are fit for purpose, against a landscape that is changing all the time.
I am grateful to the Minister for giving way and for indicating that he was willing to give way. He is making some really important points. When the Biometrics Commissioner gave evidence to our Committee, he gave a clear view that many in the police want a clear statutory framework that they can operate within. They do not want to be uncertain as to whether the next step they take will be subject to legal challenge. Surely it is in everyone’s interests to have a clear statutory framework and to do that now.
I understand that point. Although I technically do not lead on this area in the Home Office, in the context of another meeting with many of the chiefs directly involved, I have heard them talk a bit about it. They have not expressed that view directly to me, but that is not good enough. I will go back to them and get their direct view.
The hon. Member for Bristol North West spoke about his case and the legal framework for that. As that is about to be tested through a legal challenge in May, he will know there is a limit to what I can say. I am very up-front in saying that, in reviewing the landscape, it is quite clear to me that some of the oversight and governance arrangements are not clear enough. A considerable amount of work is going on in the Home Office to try to improve that situation. That will become clearer over the summer, and I will say more about it then.
The other context, if we come specifically to the work of the police—which is what we are basically talking about—is the use of biometrics and data to identify people, based on their personal characteristics. Those data are used by the private sector and the police and are very much part of our day-to-day life, as Members have said in relation to users of Facebook and Google and companies that basically make money out of their information about us. It is part of our day-to-day experience.
As the shadow Minister knows, biometrics have been an essential tool for the police for many years. If we consider that in one year alone, DNA linked more than 32,000 people to crimes, including 700 relating to murders and 700 to rapes, it sharpens the importance of this agenda for those trying to keep the peace and to protect us. For any Government of any colour who recognise that the security of the public will always be a priority, if not the priority, the question of our responsibility and how far we go to ensure that the police can continue to use biometrics and make use of the most up-to-date technologies will always be a priority.
Members have talked about the attitudes of the public, and I am sure they are right. The data I see, whether it comes from Lincolnshire’s police and crime commissioner or the Mayor’s Office for Policing and Crime, reinforces what Members have said. If members of the public are asked, “Should the police use these technologies to catch criminals?”, the answer tends to be yes, particularly in the context of the most serious crimes. We understand that, but that needs to be offset by a much more open and clear debate on the checks, balances and transparency around the use of these technologies. I absolutely understand that, but the pace of change and the opportunity are genuinely exciting for the police services. What is happening with new mobile fingerprint checkers, for example, is transformative in what they allow the police to do, including the pace with which they can work and the time that they can save by harnessing these technologies. Any Government would want to support their police system in making the best use of technology to protect the public and catch criminals, particularly those involved in the most difficult crimes.
Facial recognition is clearly a massively sensitive issue, and I welcome this debate. We have supported the principle of the pilots, and we can debate the degree to which the appropriate guidance and supervision have been in place for that. It is clear to the police and us that there are real opportunities to make use of facial matches. Generations of police officers have long used photographs of people to identify suspects, and CCTV images have been a vital tool in investigation, but what is changing is our ability to match images with increasing confidence and speed. That is the major change in the technology. In a recent example, images taken by a member of the public in a Coventry nightclub where a murder took place were quickly matched on the police national database to a known individual, who was arrested. The victim’s blood was found on his clothing, and he is currently serving a life sentence. We need to be clear about where the opportunity is in terms of matching suspect images on the national police database to wanted known individuals, ensuring that they cannot evade justice when they cross force boundaries.
It is understandable that the use of live facial recognition technology, which is the heart of the debate, raises extremely legitimate privacy concerns. Speaking not only as a Minister or a Member of Parliament but as a member of the public, I absolutely understand and share those concerns. A fundamental part of our democratic process is that those concerns are expressed here in the House or in the courts. The hon. Member for Bristol North West alluded to that. He wants us to go much further on transparency, accountability, governance and oversight, and I will try to set out the progress we hope to make on that, but the fact is that in many countries, these debates just would not take place. It is a strength of our system that we are sitting here in this debating chamber and the Minister is forced to come here and respond, that the Select Committee is able to do the work it does, and that the Government of the day show the Committee the respect it is due. That is our process, and it is not bad.
On the retention of images, the Government and successive Governments have been clear that DNA and fingerprints are not retained where someone is not prosecuted and is, in effect, an innocent person; yet with facial recognition, the facial images are retained. There is a mechanism for someone to apply to have their image deleted, but the indication is that people are not routinely told about that. What can possibly be the justification for having a very clear rule applying to DNA and fingerprints and a different rule applying to facial recognition? When are we going to get to the point where there can be automatic deletion of the images of innocent people?
I will come to that point, because I know it was a particular focus of the Committee, but first I want to conclude my remarks on facial recognition. The police have responsibilities and duties to prevent, detect and investigate crimes. The police have broad common law powers, as we are aware, that allow them to use relevant technologies, such as surveillance cameras in public places and live facial recognition, but it is clear that such use is not unfettered. The police have to exercise their powers in accordance with the law, including the Police and Criminal Evidence Act 1984, the Human Rights Act 1998 and data protection legislation.
As was alluded to, we also carry out data protection impact assessments before using a new biometric technology and before a new application of an existing technology. That includes inviting scrutiny from an independent ethics panel, regulators and commissioners. I was listening today to the chief of one of our largest police forces speaking exactly to that point, when he talked of the importance he attaches to the opinion of his local ethics panel. We will produce DPIAs for each element of the Home Office biometrics programme, and the police will produce DPIAs for each use of live facial recognition.
When it comes to the use of surveillance cameras, the police are required to have regard to the surveillance camera code. To support them in using that technology, they can draw on the guidance of the Surveillance Camera Commissioner and the Information Commissioner. Recognising concerns around the use of the new biometrics, we have set up a new oversight board that includes the Biometrics Commissioner, the Information Commissioner and the Surveillance Camera Commissioner. It will oversee new pilots and is reviewing police operational guidance for live facial recognition. There is a recognition in the system of the issues raised by Members, and mechanisms are in place.
However, I have been clear that the current arrangements are complex for both users and the public. We are therefore keen to simplify and extend the governance and oversight arrangements for biometrics. As I have said, we will update Parliament in the summer on that work. There is a limit to what I can say at the moment, but I hope that Members can take comfort from the fact that we recognise that their concerns are valid, and that, as I said, there is an active stream of work to try to simplify and extend the governance and oversight arrangements for biometrics, against a background of rapid change in the landscape.
The policy on custody images was established in the 2017 review, and allows people who have been arrested but not convicted to ask the police to remove their custody images. There is a strong presumption in favour of removal. It is critical that people are aware of their rights, and debates such as today’s, as well as the work of non-governmental organisations, help to increase that awareness.
The policy is public and set out on gov.uk, and is covered in the management of police information and authorised professional practice guidance. However, we cannot rely on that, and we need to go further. The police will tell all those who go into custody about the policy through information that they hand out. We will also review the policy, and use a current police audit of requests to inform our conclusions. I undertake that the views of the House will also be taken into account.
The hon. Member for Bristol North West and the Chair of the Science and Technology Committee spoke about automatic deletion of data for people who are not convicted. The Committee Chair will be aware that Baroness Williams of Trafford, who leads on the issue in the Home Office, has written to the Committee to give a further explanation of, frankly, the complexity underlying the issue. There is no debate about where we want to get to: we want to move to a system that is automatic. Her letter to the Committee, which I will share with the shadow Minister out of courtesy, sets out some of the complexities in delivering the timeline for which Members are reasonably asking.
As I understand it, the fundamental issue is that, unlike the arrangements for DNA and fingerprints, there is no single national system for custody images, with a unique identifier for every record. Many records have the appropriate identifier, enabling them to be linked to arrest records. However, there are several million on the police national database that cannot be linked easily, or at all. They would have to be manually reviewed or deleted in bulk, entailing many thousands of hours of work.
There is therefore an issue surrounding the different ways in which police databases work, and a fragmented landscape of local police force systems and different practices. It is genuinely complicated work. There is no quick fix, but I am satisfied that there is a determination to get to the end objective that we all want. In the meantime, we will work with the police to improve their procedures to better comply with the agreed policies. I will press the system harder on that, because obviously the current system is not satisfactory, or acceptable to me.
I will leave a few minutes for the hon. Member for Bristol North West to wind up, but I stress that biometrics, as the shadow Minister knows, play a fundamental role in many aspects of modern life and a vital role in the work of police, and have done for an extremely long time. We have a duty, as a Government and a Parliament, to support the protectors of the peace by ensuring that they can make use of new technologies in the most appropriate way. However, we must do our duty by the public we serve by ensuring that there are the right checks and balances in the process.
Ultimately, the public we serve and protect have to trust the process and continue to trust the police. We know the importance of trust in the modern age. Strikingly, the public we serve and represent continue to have high levels of trust in the police, whereas trust in many other traditional institutions has plummeted. It is important to me, and to anyone who will do my role in the future, that we maintain that trust. The inappropriate use of technology, or a lack of trust concerning how technologies are used in the future, is therefore a core challenge that the Home Office, under any colour of Administration, needs to take extremely seriously.
As the Home Secretary has said, we are not a surveillance state and have no intention of becoming one. That means that we must use new technologies in ways that are sensitive to their impact on privacy, and ensure that their use is proportionate and maintains the public trust that is at the heart of our policing model.
I thank the hon. Member for Strangford (Jim Shannon), my hon. Friends the Members for Stretford and Urmston (Kate Green) and for Newcastle upon Tyne Central (Chi Onwurah), and the shadow Minister, my hon. Friend the Member for Sheffield, Heeley (Louise Haigh), for their contributions. I also welcome the interventions of the Chair of the Science and Technology Committee, the right hon. Member for North Norfolk (Norman Lamb), and the Minister’s responses.
It is clear from today’s debate that everyone, including the Minister and, by extension, the Home Office, agrees that we have some work to do, which is a good conclusion. I put it on record that the Select Committee is interested in the actions being taken by the Scottish Government in the biometric data Bill that the hon. Member for Cumbernauld, Kilsyth and Kirkintilloch East (Stuart C. McDonald) mentioned. We will keep a close eye on the work being done in Scotland, and think about what lessons we might learn in Westminster.
As my hon. Friend the Member for Stretford and Urmston said, if we have 29 million facial scans for one hit, we clearly need to have a better debate about the balance between impact and invasion of privacy. As many colleagues mentioned, the demand for a stronger regulatory system comes not just from police forces, commissioners, politicians and the public, but from the technology companies providing such solutions. They wrote to me in advance of the debate to say that they want to do the right thing, and would rather there were a framework in which they can operate so that, no doubt for their own brand purposes, they are not pushing the envelope by delivering solutions that police forces and others may take too far.
I welcome the Minister’s commitment to privacy impact assessments. I am sure that we all welcome that confirmation. I understand that dealing with legacy IT systems is difficult; we have been talking about that on the Select Committee too. We encourage the Government not to put a sticking-plaster over old systems, but to invest in new ones, so that we are not just dealing with a legacy problem, but building something fit for the future. I look forward to reading the letter that the Minister referred to from Baroness Williams of Trafford.
The Minister said that it was good that in our system we can hold the Government to account and show our interest in such matters. It is clear from the debate, and from the Select Committee’s ongoing work, that we will continue to do so. We therefore look forward with anticipation to the further announcements that the Minister has committed to in the season of “summer”. Even though we do not quite know when that will start or end, we look forward to those announcements, and I thank him for his contribution.
Question put and agreed to.
That this House has considered facial recognition and the biometrics strategy.