Report (1st Day) (Continued)
Clause 8: Child’s consent in relation to information society services
7: Clause 8, page 5, line 19, at end insert—
“(2) The Secretary of State must as soon as practicable after the passing of this Act by regulations require the Commissioner to set standards for the age-appropriate design of relevant information society services accessed by children and that such standards are to be set out in a code in accordance with section (Age-appropriate design code).”
Before I turn to the amendments in my name and that of the Minister, the noble Lord, Lord Stevenson, and the noble Baroness, Lady Harding, I would like to recognise the extraordinary role of children’s charities led by the NSPCC; the Duke of Cambridge Task Force; child experts John Carr and Professor Sonia Livingstone; the Children’s Commissioner; and the remarkable support of colleagues on the Communications Committee and from all sides of this House, especially the noble Lords, Lord Storey and Lord Clement-Jones. Without these powerful voices, we would not be introducing a statutory code of age-appropriate design to the Bill.
These amendments are a step towards a better digital future for children. They introduce a code that will set out the standards by which online services protect children’s data. They set standards that are directly related to a child’s age and the vulnerabilities associated with that age. They clarify the expectation on services to design data practices that put the “best interests” of the child above any other consideration, including their own commercial interest. They establish the standards by which the Information Commissioner will judge services on behalf of child users. Crucially, they connect design of services with the development needs of children, recognising that childhood is a graduated journey from dependence to autonomy.
Amendment 109 states that the Information Commissioner must consult widely on an age-appropriate design code, in particular capturing the voice of children, parents, child-development experts, child advocates and trade associations. In doing so, she will have to determine if the use of GPS location services to hold, sell or share a child’s current or predicted location is in a child’s best interest. She will have to consider if privacy settings for children should be automatically set to private. She will have to consider if the service can justify the collection of personal data, such as a child’s school or home address, their birth date, their likes, dislikes, friends or photographs, in order to facilitate a specific activity being undertaken by that child. She will have to deconstruct terms, conditions and privacy notices in order to make them understandable by, and appropriate for, children of different ages. A six year-old needs different protections and information from a 15 year-old. She will have to consider, with the development stages of childhood in mind, whether paid-for activity such as product placement and marketing is transparent to a child user and what reporting and resolution processes should be offered to children.
Responding to the concern raised by my noble friend Lord Erroll, the code will set out the duty of online services to facilitate the child’s right to erasure under the GDPR, with or without the help of an adult. Perhaps most importantly, the commissioner will—for the first time—consider strategies used to encourage extended user engagement; that is, those design features variously called sticky, reward loops, captology and enrapture technologies that have the sole purpose of making a user stay online. These will be looked at from a child development point of view. The opportunity cost, the need for a rich balance of varied online experiences as well as the need to get offline with ease will all be given weight.
Finally, the amendment invokes the UNCRC. The age-appropriate design code must incorporate all the rights of children, and the responsibilities of adults to them, as set out in the charter. The code created by the amendment will apply to all services,
“likely to be accessed by children”,
irrespective of age and of whether consent has been asked for. This particular aspect of the amendment could not have been achieved without the help of the Government. In my view it is to their great credit that they agreed to extend age-appropriate standards to all children.
Amendment 111 states that the code must be laid before Parliament as soon as practicable, but no later than 18 months from the passing of the Bill. Amendments 112 and 113 confirm the negative resolution procedure. Amendment 114 allows the commissioner to update the code. In Committee, my noble friend Lady Howe raised the question of enforcement. Although the code is not mandatory for online services, it is mandatory for the ICO to take it into account when investigating breaches and taking enforcement action.
Amendment 110 puts the age-appropriate design code into Clause 121 and, consequentially, into Clause 123. This means that online services facing a complaint of any kind, which have not complied with the age-appropriate code, risk enormous enforcement consequences, including the spectre of fines of up to €20 million or 4% of annual global turnover. In Committee, doubts were raised that it was technically possible to regulate the digital environment, so I am particularly grateful to the noble Baronesses, Lady Lane-Fox and Lady Shields, to Sky and to TalkTalk, for making it clear that there is no technological impediment to effective design; it is simply a question of corporate and legislative will.
Self-regulation has not provided a high bar of data protection for children. On the contrary, we have seen a greedy disregard of children’s needs from some sections of the tech sector in their eye-watering data collection policies. The introduction of a statutory code makes very clear what is required of them, and although data protection is crucial, it is not the only issue that confronts children in the digital environment. The principle which these amendments establish—that a child is a child, even online—must now be established in every aspect of a child’s digital life, as a cultural and legal norm.
On this subject, I urge the Government to take one further step in the Bill: the introduction of a super-claimant procedure provided for by article 80(2) of the GDPR, and supported by the ICO. Children need advocates in all areas of life, including the digital. We will, no doubt, return to that in the new year. In the meantime, I thank the Minister, DCMS officials, the Bill team, the Minister for Digital and the Secretary of State. Along with those whom I have already mentioned, they have reason to be proud of introducing age-appropriate design standards to the Bill. Above all, it is a necessity for a 21st century child to access the digital environment knowledgably, creatively and fearlessly.
I support my noble friend Lord Clancarty, who has an amendment in this group. I look forward to hearing from the Minister about the Government’s commitment to the aspects of design that the commissioner will consider; an assurance that children’s needs will be at the heart of this code; and a clear indication that enforcement will be a priority for the commissioner and robustly applied. I beg to move.
My Lords, I remind your Lordships of my register of interests in the digital space, not least as the ex-chief executive of TalkTalk and trustee of Doteveryone. I add my thanks to those of my noble friend Lady Kidron. I also thank her for her tireless campaigning on behalf of children, and the energy, drive and commitment that she has shown in bringing all of us on this journey. We definitely would not be here today without her. I also thank my noble friend the Minister and the ministerial team both here and in the other place and the noble Lord, Lord Stevenson. This is genuinely a team effort, both within this House and, as the noble Baroness, Lady Kidron, set out, among all the charities and organisations which work tirelessly to ensure that we protect the vulnerable in the digital world—most importantly in the case we are discussing today, our children.
A code of practice for age-appropriate design for digital services is a hugely important step. Every time I speak in this House I talk about how much I believe that the digital world is a force for good and of the opportunity it presents us, particularly as an open country which embraces new technology. We have a history of not just embracing new technology but of protecting the vulnerable as we do so. This amendment is an important landmark in that journey for the digital world as we need the digital space to be civilised, every bit as civilised as the physical world, and we struggle in debating how we ensure that the physical world is civilised.
Data is at the core of digital and therefore this amendment is at the core of building a civilised digital society as it recognises that children’s data needs must be addressed and that children need to have special protections in the digital world, just as they do in the physical world. We are taking a hugely important moral as well as legal step in our digital journey. However, a code of practice will make a difference only if it changes behaviours, and, in this case, changes the behaviours of very big and very small digital service providers. Sadly, we are debating this issue because self-regulation is not working. I certainly think it is sad that that is the case. I very much hope that this amendment will start to drive the right behaviours but it will do so only if it has teeth. Therefore, when my noble friend the Minister replies, I would be interested to hear his interpretation of the powers that this amendment would give the Information Commissioner. We need it to give her position teeth. We need to ensure that the ICO has sufficient resources to conduct the consultation properly in a reasonable period of time to provide commercial businesses big and small with sufficient time to enable them to implement this measure for children. A code will be effective only if tech companies subsequently change their behaviour.
I still very much hope that this debate and the amendment itself demonstrate to technology companies big and small our commitment as a country to protect our children online, and our expectation that all businesses will play their part. I still firmly believe that the free market works in most cases. I hope that simply by setting this process in train, technology companies will start to implement some of the basic protections for children that we discussed in Committee. It will be so much easier for the ICO to implement these standards if many of the basic protections are already in place but, much more importantly, our children would be safer from tomorrow rather than in 18 months’ or two years’ time. I am delighted to see this amendment supported on all sides of the House.
My Lords, I am glad to support Amendment 7 and the related amendments in the name of my noble friend Lady Kidron. Like others, I commend her for her perseverance and commitment in ensuring that we see children flourish as they grow from the early years of digital interaction to adulthood.
In 2010, the annual Ofcom media report made no mention of tablet computers. In 2017, 21% of three year-olds have their own tablet. This is the world in which our children are growing up. We use the global term “children” easily, which under the United Nations Convention on the Rights of the Child means a person under the age of 18. As those years encompass such diverse development, the Information Commissioner has a considerable challenge ahead to identify design suitable to cover all those needs. I for one wish her well.
As I have made clear on many occasions, I am for positive use of the internet by children, and for resources which help parents raise their children in the digital age. With that preface in mind, I would like to ask some questions about these amendments to clarify the intentions and the way forward.
First, during the debates we have had on Clause 8, we have talked about children aged between 13 and 16. Amendment 109 refers to a code being developed for sites,
“which are likely to be accessed by children”.
I hope that my noble friend and the Minister will clarify which age group we are referring to, since there is no definition of children in the Bill but the terms “child” and “children” are used in the headings of Clauses 8 and 191, where the relevant age of the child is 13 and 12 respectively. As Amendment 109 refers to the UNCRC, I assume that the intention is that the age-appropriate design code of practice will cover all children up to the age of 18. However, it would be very helpful for a definition of children to be included in the relevant clauses so that there is no uncertainty.
Secondly, I hope that there will be clarification of which sites will fall within the requirements of the code. Clearly, the expectation is that the code will go beyond sites which would require the consent of children, but will it apply only to sites whose primary intention is to reach children? For instance, in the last couple of weeks, Facebook has launched a chat app for children who are not old enough to be signed up to Facebook. The new app is aimed at six to 12 year-olds. Will the new code apply just to this app or to the version of Facebook that permits access by those aged 13 and above as well?
On 23 November, this House discussed online problem gambling. A number of interventions were made by noble Lords on online gambling sites that have games involving cartoon characters which look similar to characters in children’s TV, and most certainly appeal to children. When the Times reported on these games, the chief executive of the Remote Gambling Association said that companies were not deliberately targeting children but that some nostalgic games might inadvertently be attractive to them. I hope that the position of these sites under the code, which in theory should not be accessible to children but clearly are, will also be addressed.
Thirdly, how will sites complying with the age-appropriate design be obvious to parents, especially to parents who consent to their child’s use of any data? In this context, will the new code be incorporated into the next draft of the Internet Safety Strategy? Finally, how will the code be enforced? Without some good enforcement mechanism, it is likely that it will not have as wide-reaching an impact as this House hopes that it will.
These amendments have come at a late stage in our consideration of this Bill. I look forward to hearing what my noble friend and the Minister have to say in response to my questions. I hope that the other place will continue to reflect on the proposal before us today and refine it if necessary. I hope too that it will continue to ask questions about whether the digital age of consent of 13 is the most appropriate age, and that there will be satisfactory evidence that 13 is in the best interests of our young people.
The internet puts the world at the fingertips of our children. I commend my noble friend Lady Kidron for working to ensure that children are able to make the most of this amazing resource in a way that supports child development.
My Lords, I thank the noble Baroness, Lady Kidron, for moving these amendments with such incredible clarity that I was able to understand what they were saying. My question follows on from the point made by the noble Baroness, Lady Howe, about how these amendments would be enforced. As the noble Baroness, Lady Jay of Paddington, said in Committee, all these issues arise in an international context. How will the international dimension work with regard to these amendments? I would be concerned if we were to impose rules in this country which might create divergence from the GDPR and hence make it more difficult to achieve the eventual accommodations with the European Union that would allow us to continue to do business with it in the longer term. There is an international dimension to all this and I do not understand how it would work with regard to these amendments.
My Lords, not for the first time in her distinguished career in this House, the noble Baroness, Lady Howe, has asked some pertinent questions, the answers to which I look forward to. First, however, I pay tribute to the noble Baroness, Lady Kidron. It is quite often difficult for a parliamentarian to know whether they have made a difference; we all get swept up in the tide of things. However, I have looked at the Bill as it has moved through both the other place and here, and without her intervention, her perseverance and her articulate exposition of the case, we would not be where we are today. She should take great credit for that.
In some respects, there is a sense of déjà vu. I am glad to see the noble Lord, Lord Puttnam, in his place; I was on his committee 15 years ago which looked at the Communications Act and the implications of what were then new technologies. However, looking back, the truth is that we had only an inkling of the tsunami of technology that was about to hit us and how we would control it. There are some things that we might have done during the passage of that Bill to anticipate some problems that we did not do. However, it is always difficult to know the future. Indeed, of all the things I have had a bit to do with, the creation of Ofcom is one that I take great pride in. For all its problems, Ofcom has proved itself a most effective regulator, and these days it seems that it is asked to do more and more.
That brings us to what is being suggested with the ICO. It is extremely important that the ICO is given the resources, the teeth and the political support to carry out the robust tasks that we are now charging it with. That was not thought of for the ICO when it was first created. We are therefore creating new responsibilities, and we have to will the ends in that respect.
One of the good things about the amendments in the name of the noble Baroness, Lady Kidron, is that this is beginning slightly to impinge on the tech companies—they cannot exist in a kind of Wild West, where anything goes. I think I said at an earlier stage that when I hear people say, “Oh well, the internet is beyond political control and the rule of law”, every fibre of my being as a parliamentarian says, “Oh no it’s not, and we’ll show you that it’s not”. This is a step towards making it clear to the tech companies that they have to step up to the plate and start developing a sense of corporate social responsibility, particularly in the area of the care of children.
I share with the noble Baroness, Lady Harding, the confidence that with these new technologies we have an enormous potential for good. However, the jury is still out on whether we will create a civilised space or not. It would depend on how we legislate, what powers we give to the regulator, which we give teeth to, and on how smart the tech companies are in responding to this genuine public concern. If they do not understand that people are concerned, and that once they get concerned they will get angry, they will diminish their opportunities to play a constructive role in what can be a quite transforming technological revolution, which we are about to experience.
My Lords, I add my voice in congratulating the noble Baroness, Lady Kidron, on her amendment and on the way it was presented. I will try to add additional value to the discussion. I, along with the noble Baronesses, Lady Harding, Lady Shields and Lady Lane-Fox, have spent a lot of time—in my case, 20 years—defending and promoting the tech industry. I believe in the tech industry and in its educational capacity and many of the developments it can produce. I also have many friends in the tech industry, which makes it doubly difficult. That is why I find it so difficult to understand why they are not part of this.
One reason, which is important but which has not been mentioned, is that these are the UK subsidiaries of major global businesses. When well-meaning people in the UK look at this problem and would probably like to address it, they get barked at down the phone by someone who has no conception of the strength of feeling in this House or in the UK and Europe, and so they do not get a sympathetic hearing. By passing this amendment, this House can send a message back to the west coast of the United States to say, “I’m very sorry—your values do not prevail here. We’re looking for something different: a tech industry that supports, enhances and encourages the type of society that we all want to be part of”. It is important to get that message back.
It is not just us saying that. David Brooks, the eminent journalist for the New York Times, ended his piece on 20 November by saying:
“Tech will have few defenders on the national scene. Obviously, the smart play would be for the tech industry to get out in front and clean up its own pollution”.
That is the intelligent view. The tech industry I have promoted and believe in will get out in front and understand the signal that is being sent from this House, and will begin to do something about it. It will be quite surprising what they can do, because in a sense we may well be helping the senior executives in Europe to get their message back to the west coast of the United States. That is one important reason why I support the amendment.
My Lords, I cannot add much to what the noble Baroness, Lady Kidron, said when she took us on her concise, comprehensive canter through her amendments, but I will mention two things.
The first is in response to the noble Lord, Lord Arbuthnot, who is right to say that enforcement is essential, particularly because it is international—the internet is international. We faced this with Part 3 of the Digital Economy Act in trying to prevent children getting pornography. One of the things that became apparent is that the payment services providers are good on this sort of thing, and if it looks right and the community agrees it, they will withdraw payment services from people who do not comply. As most websites are out there to make money, if they cannot get the money in, they quickly come into line. So there may be some enforcement possibilities in that area, as it ends up being international.
The other thing we noticed is that the world is watching us in Britain because we are leading on a lot of these things. If we can make this effective, I think other countries will start to roll it out, which makes it much easier to make it effective. It is a big question because at the end of the day we are trying to balance the well-meaning desire of the developers and those producing these apps, who want to deliver a ubiquitous, useful utility everywhere, with the protection of the young. That is a difficult thing to do, which is why this has to remain flexible. We have to leave it up to someone who is very wise to get us there. If we get it right, this could be a very good step forward.
My Lords, I rise in support of these amendments, as if any further support were needed. I speak as the Member of your Lordships’ House who chaired your Communications Committee when we produced our report, Growing Up With the Internet. My noble friend Lady Kidron was a most distinguished member of the committee and greatly helped us in formulating our recommendations. Alongside support for parents and schools and other measures, the committee sought government intervention in curbing the poor practices of the organisations providing content and delivering the internet’s services to children, especially through social media. This group of amendments takes forward that central theme from the committee’s report, and I thank my noble friend and congratulate her on her foresight and tenacity in pursuing this. I also thank the Minister, backed by his Secretary of State, for supporting these amendments today.
The underlying significance of the amendments is that they establish a process for government—for society—to intervene in determining the behaviour of those responsible for internet services that can have such a huge impact on the lives of our children. In particular, the new process will cover the activities of huge global companies such as Facebook and Google, among the most prosperous and profitable organisations on the planet, which have the power, if only they would use it, to ensure the safety and well-being of children online. The process set in train by these amendments involves empowering the Information Commissioner to set the standards that all the key players will be expected to adopt or face significant sanctions. The amendments mark a necessary shift away from depending on good will and purely voluntary self-regulation. They represent a breakthrough in holding to account those mighty corporations based far away in Silicon Valley, to which the noble Lord, Lord Puttnam, made reference, and others closer to home. It is good to see major organisations such as Sky and TalkTalk supporting such a change, alongside the major charities such as the NSPCC.
Your Lordships’ Communications Committee and the whole House owe a huge debt of gratitude to my noble friend Lady Kidron for so diligently taking forward the arguments that have led to the significant change which these amendments herald. I know that the committee, as well as all those concerned with the safety and well-being of the nation’s children, will greatly welcome this big step towards ensuring better behaviour from all the relevant commercial enterprises. I suggest that this is a major step in protecting not just children in the UK but children around the world as the value of this kind of intervention becomes recognised, as the noble Earl, Lord Erroll, mentioned. The amendments get my fulsome support.
My Lords, I will speak to Amendment 117 in my name, but before I do I warmly congratulate my noble friend Lady Kidron on obtaining this important code of practice for children. I apologise for not having spoken in the debate on this Bill previously, but Amendment 117 is significant and is also a children’s rights issue.
If there is to be—correctly—a sensitivity concerning age-appropriate understanding by children in relation to information services, the same should be no less true in the school setting, where personal data given out ranges from a new maths app to data collected by the DfE for the national pupil database. A code of practice needs to be introduced that centres on the rights of the child—children are currently disempowered in relation to their own personal data in schools. Although not explicitly referred to in this amendment, such a code ought to reflect the child’s right to be heard as set out in Article 12 of the UN Convention on the Rights of the Child. Among other things, it would allow children, parents, school staff and systems administrators to build trust together in safe, fair and transparent practice.
The situation is complicated in part by the fact that it is parents who make decisions on behalf of children up to the age of 18; although that in itself makes it even more necessary that children are made aware of the data about themselves that is collected and every use to which that data may be put, including the handing on to third-party users, as well as the justification for so doing. The current reality is that children may well go through life without knowing that data on a named basis is held permanently by the DfE, let alone passed on to others. There may, of course, be very good research reasons why data is collected, but such reasons should not override children’s rights, even as an exemption.
It is because there is no clear code of practice for a culture of increased data gathering in the school setting that we now have the current situation of growing controversy, enforcement and misuse. It is important, for instance, that both parents and children, in their capacity to understand, are made aware—as schools should be—of what data can be provided optionally. However, when nationality and place of birth were introduced by the DfE last year, many schools demanded that passports be brought into the classroom. In effect, the DfE operated an opt-out system. The introduction of nationality and place of birth data also raises the question of the relevance of data to improving education and its ultimate use. Many parents do not believe that such data has anything to do with the improvement of education. Last week, Against Borders for Children, supported by Liberty, launched an action against the Government on this basis.
There is now also considerable concern about the further expansion of the census data in January next year to include alternative provision data on mental health, pregnancy and other sensitive information without consent from parents or children, with no commitment to children’s confidentiality and without ceasing the use of identifying data for third-party use.
It was only after FOI requests and questions from Caroline Lucas that we discovered that the DfE had passed on individual records to the Home Office for particular immigration purposes. As defenddigitalme said, such action,
“impinges on fundamental rights to privacy and the basic data protection principles of purposes limitation and fairness”.
I appreciate that as the Bill stands such purposes are an exemption, but teachers are not border guards.
In 2013, a large number of records were passed to the Daily Telegraph by the DfE. In an Answer given on 31 October this year by Nick Gibb to a Question by Darren Jones, he incorrectly said that individuals could not be identified. There is no suggestion that there was any sinister intent, but many parents and schoolchildren would be appalled that a newspaper had possession of this data or that such a transfer of information was possible. Moreover, in the same Answer he said that he did not know how many datasets had been passed on. This is unacceptable. There needs to be a proper auditing process, as data needs to be safe. It is wrong too that a company may have more access to a pupil’s data than the pupil themselves, or indeed have such data corrected if wrong.
It is clear that from the Government’s point of view, one reason for having a good code of practice is to restore confidence in the Government, but this should not be the main reason. In September, Schools Week reported that the Information Commissioner’s Office was critical of the current DfE guidance, which is aimed at schools rather than parents or children and is, in the main, procedural. It said that rights were not given enough prominence. Both children and parents need to be properly informed of these rights and the use to which data is put at every stage throughout a child’s school life and, where applicable, beyond.
My Lords, from these Benches I add my very strong welcome for this amendment. I endorse everything that my noble friend Lord McNally said about the noble Baroness, Lady Kidron, and her energy and efforts. In fact, I believe that she was far too modest in her introduction of the amendment. I agree with the noble Lord, Lord Best, that, quite honestly, this is essentially a game-changer in the online world for children. As he said, the process of setting standards could be much wider than simply the UK. As the noble Lord, Lord Puttnam, said, these major tech companies need to wake up and understand that they have to behave in an ethical fashion. Having been exposed to some of the issues in recent weeks, it is obvious to me that as technology becomes ever more autonomous, the way tech companies adopt ethical forms of behaviour becomes ever more important. This is the start of something important in this field. Otherwise, the public will turn away and will not understand why all this is happening. That will inevitably be the consequence.
Perhaps I may add something to what the noble Baroness, Lady Kidron, said. As I said, she is too modest. I have read her report, Digital Childhood—Addressing Childhood Development Milestones in the Digital Environment, which contains lessons the Government should take on board for their internet safety strategy. I hope the Minister will read it himself. There is a useful passage which states:
“We cannot solely rely on the digital resilience of children. Industry and government must adapt the digital environment to make it fit for children by acting above and beyond commercial consideration”.
That is exactly what the amendment does. However, it is applicable also to many other aspects of the internet safety strategy. Many other points are made in the report: that the Government should use childhood development milestones to determine their policy-making process, that industry should commit to delivering an age-appropriate digital agency to children even when it challenges its own commercial interests, and so on.
The noble Baroness is far too modest. I read the report with huge admiration for the work that has gone into it and the principles enunciated in it, which can inform a future digital safety strategy. I hope it is required reading for anyone within the DDCMS who is charged with taking forward the digital safety strategy.
On Amendment 117, tabled by the noble Earl, Lord Clancarty, I come rather late to the party. However, he made a strong case—there is a crucial case to answer. I know that 20 organisations have written to the Prime Minister on this issue today. The new national data collection in the alternative provision census comes into effect in January, with new labels being added to children’s records and the national pupil database. This database is currently in use. Apparently more than 1,000 requests for the use of confidential pupil records in the database have been approved by the Department for Education since March 2012. It is not a theoretical database and the Department for Education has to take responsibility for it. The points made by the noble Earl need answering—if not by the Minister then certainly by the Department for Education.
My Lords, we have had a good discussion this evening about topics raised in Committee, where the strength of feeling and expertise displayed was highly instrumental in persuading Ministers to think again about the approach they were taking towards the regulatory process for children’s data on the internet. It shows that well-argued cases can get through even the most impervious armour put on by Ministers when they start battling on their Bills. I am delighted to see it.
The noble Lord, Lord Clement-Jones, commented on Amendment 117, tabled by the noble Earl, Lord Clancarty. I wondered why that amendment had been included in the group because it seemed to point in a different direction. It deals with data collected and used by the Government, having cleared what would presumably be the highest standards of propriety in relation to it. However, the story that emerged, endorsed by the noble Lord, Lord Clement-Jones, is shocking and I hope that the Minister will be able to help us chart a path through this issue. Several things seem to be going wrong. The issues were raised by my noble friend Lord Knight in Committee, but this amendment and the paperwork supplied with it give me a chill. The logic behind the amendment’s being in this group is that this is the end-product of the collection of children’s data—admittedly by others who are providing it for them in this case—and it shows the kinds of dangers that are about. I hope that point will be answered well by the Minister when he comes to respond.
I turn to the substantive amendment; it is an honour to have been invited to sign up to it. I have watched with admiration—as have many others—the skilful way in which the noble Baronesses, Lady Kidron and Lady Harding, and others have put together a case, then an argument and then evidence that has persuaded all of us that something can be done, should be done and now will be done to make sure that our children and grandchildren will have a safe environment in which they can explore and learn from the internet.
When historic moments such as this come along you do not often notice them. However, tonight we are laying down a complete change in the way in which individuals relate to the services that have now been provided on such a huge scale, as has been described. I welcome that—it is an important point—and we want to use it, savour it and build on it as we go forward.
I first sensed that we were on the right path here when I addressed an industry group of data-processing professionals recently. Although I wowed them with my knowledge of the automatic processing of data and biometric arguments—I even strayed into de-anonymisation, and got the word right as I spoke in my cups—they did not want anything to do with that: they only wanted to talk about what we were going to do to support the noble Baroness, Lady Kidron, and her amendments. When the operators in industry are picking up these debates and realising that this is something that they had always really wanted but did not know how to do—and now it is happening and they are supporting it all they can—we are in the right place.
The noble Baroness, Lady Harding, said something interesting about it being quite clear now that self-regulation does not work—she obviously has not read Adam Smith recently; I could have told her that she might have picked that up from earlier studies. She also said, to redeem herself, that good regulation has a chance to change behaviour and to inculcate a self-regulatory approach, where those who are regulated recognise the strength of the regulations coming forward and then use it to develop a proper approach to the issue and more. In that sense she is incredibly up to date. Your Lordships’ House discussed this only last week in a debate promoted by the noble Baroness, Lady Neville-Rolfe, on what good regulation meant and how it could be applied. We on these Benches are on all fours with her on this. It is exactly the way to go. Regulation for regulation’s sake does not work. Stripping away regulation because you think it is red tape does not work. Good regulation or even better regulation works, and that is where we want to go.
There are only three points I want to pick out of the contribution made by the noble Baroness, Lady Kidron, when she introduced the amendment. First, it is good that the problem we saw at the start of the process about how we were going to get this code applied to all children has been dealt with by the Government in taking on the amendment and bringing it back in a different way. As the noble Baroness admits, their knowledge and insight were instrumental in getting this in the Bill. I think that answers some of the questions that the noble Baroness, Lady Howe, was correctly asking. How do the recommendations and the derogation in the Bill reducing the age from 16 to 13 work in relation to the child? They do so because the amendment is framed in such a way that all children, however they access the internet, will be caught by it, and that is terrific.
The second point I want to make picks up on a concern also raised by the noble Baroness, Lady Harding. While we are probably not going to get a timescale today, the Bill sets a good end-stop for when the code is going to be implemented. However, one hopes that when the Minister comes to respond, he will be able to give us a little more hope than having to wait for 18 months. The amendment does say,
“as soon as reasonably practicable”,
but that is usually code for “not quite soon”. I hope that we will not have to wait too long for the code because it is really important. The noble Baroness, Lady Harding, pointed out that if the message goes out clearly and the descriptions of what we intend to do are right, the industry will want to move before then anyway.
Thirdly, I turn to the important question of how the code will be put into force in such a way that it makes sure that those who do not follow it will be at risk. Yes, there will be fines, and I hope that the Minister is able to confirm what the noble Baroness asked him when introducing her amendment. I would also like to pick up the point about the need to ensure that we encourage the Government to think again about the derogation under article 80(2). I notice in a document recently distributed by the Information Commissioner that she is concerned about this, particularly in relation to vulnerable people and children, who might not be expected to know whether and how they can exercise their rights under data protection law. It is clear that very young people will not be able to do that. If they cannot or do not understand the situation they are in, how is enforcement going to take place? Surely the right thing to do is to make sure that the bodies which have been working with the noble Baroness, Lady Kidron, which know and understand the issues at stake here, are able to raise what are known as super complaint-type procedures on behalf of the many children to whom damage might be being done but who do not have a way of exercising their rights.
If we can have a response to that when we come to it later in the Bill, and in the interim get answers to some of the questions I have set out, we will be at the historic moment of being able to bless on its way a fantastic approach to how those who are the most vulnerable but who often get so much out of the internet can be protected. I am delighted to be able to support the amendment.
My Lords, first, like other noble Lords, I pay tribute to the noble Baroness, Lady Kidron, for her months—indeed, years—of work to ensure that the rights and safety of children are protected online. I commend her efforts to ensure that the Bill properly secures those rights. She has convinced us that it is absolutely right that children deserve their own protections in the Bill. The Government agree that these amendments do just that for the processing of a child’s personal data.
Amendment 109 would require the Information Commissioner to produce a code of practice on age-appropriate design of online services. The code will carry the force of statutory guidance and set out the standards expected of data controllers in complying with the principles and obligations placed on them by the GDPR and the Bill. I am happy to undertake that the Secretary of State will work in close consultation with the Information Commissioner and the noble Baroness, Lady Kidron, to ensure that this code is robust, practical and, most importantly, meets the development needs of children in relation to the gathering, sharing, storing and commoditising of their data. I have also taken on board the recommendations of the noble Lord, Lord Clement-Jones, on the internet safety strategy. We have work to do on that and I will take his views back to the department.
The Government will support the code by providing the Information Commissioner with a list of minimum standards to be taken into account when designing it. These are similar to the standards proposed by the noble Baroness in Committee. They include: default privacy settings; data minimisation standards; the presentation and language of terms and conditions and privacy notices; uses of geolocation technology; automated and semi-automated profiling; transparency of paid-for activity such as product placement and marketing; the sharing and resale of data; the strategies used to encourage extended user engagement; user reporting and resolution processes and systems; the ability to understand and activate a child’s right to erasure, rectification and restriction; the ability to access advice from independent, specialist advocates on all data rights; and any other aspect of design that the commissioner considers relevant.
The new age-appropriate design code interlocks with the existing data protection enforcement mechanism found in the Bill and the GDPR. The data protection principles apply equally to children and are applied by data controllers on the basis of guidance provided by the commissioner. The GDPR makes clear that children merit specific protection with regard to their personal data as they may be less aware of the risks and consequences. The code will establish the standards required of data controllers to meet this obligation. The status of a statutory code means that any organisation that ignores it is taking a significant legal risk.
The Information Commissioner considers many factors in every regulatory decision, but non-compliance with this code will weigh particularly heavily for a non-compliant website or app maker. Organisations that wish to minimise their risk of being penalised up to €20 million or 4% of global turnover will apply the code; it would be foolhardy not to do so. I hope the noble Lords, Lord McNally and Lord Puttnam, can take some comfort from that. The new code on age-appropriate design will have the same proven enforceability as our codes on direct marketing and data sharing, issues we take extremely seriously. My noble friends Lady Harding and Lord Arbuthnot, the noble Lord, Lord Puttnam, and the noble Baroness, Lady Howe, along with many other noble Lords, have asked in effect whether the codes have teeth. We say that they do.
The principle-based regulatory approach sets few rules so the ICO produces guidance as to how those principles must be observed. Organisations may find alternative ways of meeting the requirements, but will need to demonstrate compliance. If they do nothing, they risk breaking the law. While all ICO guidance has teeth, statutory codes have sharper teeth. They can be used in evidence in any legal proceedings, not only data protection proceedings. In determining a question arising from proceedings, courts and tribunals must take into account any part of the code that appears to them relevant to that question. In carrying out her functions, the Information Commissioner must also take the code into account. When investigating a breach of data protection law, the commissioner has to decide whether the data controller acted reasonably. In conducting this balancing test, the failure to comply with a statutory code will weigh heavily against the controller. In areas where there is competing guidance such as that produced by self-regulators—for example, the Institute of Fundraising, IPSO and so on—statutory guidance takes precedence.
It is proven to work. The new age-appropriate design code will be the commissioner’s third statutory code. The first was the data sharing code, which was originally provided for in the Coroners and Justice Act 2009 and will be re-enacted by Clause 119 of the Bill. This is a key tool to ensure that data controllers comply with the law. The commissioner asked for her guidance on direct marketing to be given the status of a statutory code to enable tough enforcement. The Government delivered the statutory code in the Digital Economy Act 2017 and it will be re-enacted by Clause 120 of this Bill.
Amendment 111 makes clear that the new code will be laid as soon as possible and no more than 18 months after the passing of the Bill. I have noted the comments of the noble Lord, Lord Stevenson, and I have said that it will be laid as soon as possible. The Information Commissioner, working with the Government, the noble Baroness, Lady Kidron, and a range of concerned stakeholders, will use that time to get it right and to ensure that it sets the best possible rules to protect children while allowing them to continue to enjoy the benefits of the internet and be full citizens of the digital age. I hope the House can see that this position has been developed with a concern to ensure that children in the UK are granted a robust data regime so that they can access online services in a way that meets their age and development needs. This is and will remain a top priority for the Government. Again, I thank the noble Baroness for her amendments and, much more importantly in some ways, her leadership on this matter.
Before I leave this, the noble Baroness, Lady Kidron, and the noble Lord, Lord Stevenson, mentioned article 80(2) on representation of data subjects. This is coming up. Our view is that the Data Protection Bill provides sufficient recourse for data subjects by enabling them to give consent to non-profit organisations to represent their interests in the event that their rights have been infringed. As I said, I am sure we will have a chance to debate that on the third day of Report.
While we are very pleased to support the noble Baroness’s Amendments 109, 111, 112 and 114, we also have a number of government amendments in this group. Amendments 110, 113, 115 and 116 are technical amendments to ensure that the noble Baroness’s amendments are correctly stitched into the framework for creating and enforcing codes of practice, consistent with other statutory codes that the commissioner must produce. I will move those amendments.
Finally, we believe that Amendment 117 in the name of the noble Earl, Lord Clancarty, is unnecessary. The sharing of individual-level pupil data is already highly regulated in law, specifically in Section 537A of the Education Act 1996. The Department for Education takes its data protection responsibilities seriously. To comply with the law, it has developed a rigorous process around data sharing of individual records. It does not share data simply because it is lawful to do so; it shares data only where it is both lawful and ethical to do so. As part of the approvals process, officials, including legal experts and senior civil servants with data expertise, assess the application for public benefit, proportionality—ensuring the minimum amount of data is used to meet the purpose—and legal underpinning, and ensure that the strict information security standards the department enforces are satisfied.
The Department for Education goes to great lengths to be transparent and accountable in data sharing. Since December 2013, it has provided summary information about data sharing requests that have been approved through the existing NPD Data Management Advisory Panel, providing public transparency about the sharing of individual pupil-level data.
In preparation for the GDPR coming into force, the department is actively reviewing its data-sharing processes with third parties to ensure greater security, consistency, accountability and transparency around data sharing. Before May 2018, the department will review its existing arrangements and processes for sharing sensitive personal data to ensure they are compliant with the incoming regulations, and review them regularly thereafter. As part of that work and to ensure citizens have even greater oversight of the department’s data, on 14 December the department is publishing an overview of all DfE external personal-level data sharing to date and will continue to update this publication regularly. In view of this reassurance, I would be grateful if the noble Earl did not press his amendment.
My thanks to the noble Lord the Minister. The wonderful thing about having the Minister’s name attached to your amendments is that he has answered all the difficult questions. I thank everyone who spoke for their very kind words, not on my behalf, but on behalf of all the people who work to protect children, online and offline. I accept noble Lords’ thanks. It is very moving.
I would like to say three things. I was overwhelmed this weekend when the news broke that we had come to terms on the amendment. I received many emails from other parts of the world. To those who said in the debate that this may be a first step not just for us, but for the world—or at least in Europe—all indications are in that direction. I also reassure everyone that this amendment in no way threatened adequacy. Officials and I have been through this issue at great length. We had the kind and generous advice of Jonathan Swift QC, who is a great expert in this matter. We are quite sure of adequacy. On the question of enforcement globally, this is a challenge not for this amendment alone, but one for the Bill as a whole.
In the meantime, I look forward to working with the Government and others to make sure this is a meaningful first step to creating a digital world in which children can thrive. I beg leave to withdraw Amendment 7, and note that the other amendments will be moved as they appear.
Amendment 7 withdrawn.
Clause 9: Special categories of personal data and criminal convictions etc data
8: Clause 9, page 5, line 37, at end insert—
“( ) The processing of biometric data meets the requirements of Article 9(4) of the GDPR for authorisation by the law of the United Kingdom or part of the United Kingdom only if it meets the condition in paragraph 11A of Part 2 of Schedule 1.”
My Lords, in moving Amendment 8 I will speak to Amendment 21. I will be a little longer than perhaps those waiting on their dinner would like. I apologise for that, but this is an important set of amendments for those wishing to make use of new technologies using biometrics.
In Committee the Minister focused on the use of biometrics in a clear context, such as using a fingerprint to unlock a mobile device. In that context he may be correct to say that the enabling of this security feature by the user constitutes consent—although without a record of the consent it would still fall short of GDPR requirements. However, the contexts I was aiming to cover are those where the biometric data processing is an integral part of a service or feature, and the service or feature simply will not function without it.
Other contexts I was looking to cover include where banks decide to use biometric technology as extra security when you phone up or access your account online. Some banks offer this as an option, but it is not hard to envisage this becoming a requirement as banks are expected to do more to protect account access. If it is a mandatory requirement, consent is not appropriate—nor would it be valid. HMRC has begun to use voice recognition so that people will not have to go through all the usual security questions. If HMRC does this after 25 May 2018 it could be unlawful.
This is certainly the case with biometric access systems at employment premises. It is also the case where biometrics are used in schools and nurseries, such as for access controls and identifying who is picking up a child. In schools, biometrics are sometimes used to determine entitlements, such as free meals, in a way that does not identify or risk stigmatising those who receive them, and avoids children having to remember swipe cards or carry money.
In these contexts, providing an alternative system that does not use biometrics would probably undermine the security and other reasons for having biometrics in the first place. Without any specific lawful basis for biometric data, organisations will rely entirely on the Government, the ICO and the courts, accepting that their uses fall within the fraud prevention/substantial public interest lawful bases and within the definition of “scientific research”.
The amendments are designed to meet all these objections. In particular, the research elements of the amendments replicate the research exemption in Section 33 of the Data Protection Act 1998. The effect of this exemption is that organisations processing personal data for research purposes are exempt from certain provisions of the Act, provided that they meet certain conditions. The key conditions are that the data is not used to support measures or decisions about specific individuals and that there is no substantial damage or distress caused by the processing.
In this context—I am afraid this is the reason for taking rather longer than I had hoped—it is important to place on the record a response to a number of points made in the Minister’s letter of 5 December to me about biometric data. First, he said:
“As you are aware, the General Data Protection Regulation … regards biometric data as a ‘special category’ of data due to its sensitivity”.
This is precisely why the amendment is needed. The change in status risks current lawful processing becoming unlawful. This type of data is being processed now using conditions for processing that will no longer be available once it becomes sensitive data.
Secondly, the Minister said:
“In order to process such data, a controller must satisfy a processing condition in Article 9 of the GDPR. The most straightforward route to ensure that processing of such data was lawful would be to seek the explicit consent of the data subject”.
But consent is not the most straightforward route. Consent under GDPR has to meet certain conditions to be valid, which sets the bar high. It is valid only where it is a genuine choice, and many aspects of business processing are not a choice for individuals.
Thirdly, the Minister said:
“You explained that many businesses, such as banks, already make use of biometric identification verification mechanisms. Some devices, such as computers and smartphones, allow users to access the service using fingerprint recognition, for example”.
This statement confuses two things: the use of fingerprint or other biometrics as a way to access devices such as smartphones or laptops, which is an optional feature provided by the manufacturer; and the use of biometrics by service providers such as banks to allow customers to access their accounts.
Some service providers piggyback on the smartphone or laptop functionality to offer this as an option to customers if they have already enabled it for access to the device. However, banks use biometrics, such as voice recognition, for account access over the phone so that their customer services team can authenticate you, which is unrelated to biometrics used to access a device.
Fourthly, the Minister said:
“Generally speaking, such mechanisms are offered as an alternative to more conventional forms of access, such as use of passwords, and service providers should therefore have no difficulty in seeking the data subjects’ GDPR-compliant consent”.
This is the case as long as you believe that service providers can continue to make it optional. We are already seeing a move to replace passwords with biometrics, given their weakness and the increase in the number of data hacks that compromise passwords. The more user names and passwords are compromised, the more we will demand that banks do more to prevent unauthorised access, so it would seem sensible to future-proof data protection law so that banks can seamlessly move to a system where biometric access for certain services or transactions is mandatory.
Fifthly, the Minister said:
“However, you explained that problems may arise for example when employers or schools decide to secure access to their premises with ID verification mechanisms and employees and pupils do not have real choice about whether to use the devices”—
I am sorry. I am just finding the right place in my notes.
I may have to add later to what I have said, which I think the Minister will find totally unpalatable. I will try to move on.
The Minister also said:
“You are concerned that, if consent is not a genuine option in these situations and there are no specific processing conditions in the Bill to cover this on grounds of substantial public interest, processing in these circumstances would be unlawful. To make their consent GDPR compliant, an employer or school must provide a reasonable alternative that achieves the same ends, for example, offering ‘manual’ entry by way of a reception desk”.
Consent is rarely valid in an employment context. If an employer believes that certain premises require higher levels of security, and that biometric access controls are a necessary and proportionate solution, it cannot be optional with alternative mechanisms that are less secure, as that undermines the security reasons for needing the higher levels of security in the first place: for example, where an employer secures a specific office or where the staff are working on highly sensitive or confidential matters, or where the employer secures a specific room in an office, such as a server room, where only a small number of people can have access and the access needs to be more secure.
Biometrics are unique to each person. A pass card can easily be lost or passed to someone else. It is not feasible or practical to insist that organisations employ extra staff for each secure office or secure room to act as security guards to manually let people in.
The Minister further stated:
“You also queried whether researchers involved in improving the reliability of ID verification mechanisms would be permitted to carry on their work under the GDPR and the Bill. Article 89(1) of the GDPR provides that processing of special categories of data is permitted for scientific research purposes, providing that appropriate technical and organisational safeguards are put in place to keep the data safe. Article 89(1) is supplemented by the safeguards of clause 18 of the Bill. For the purposes of GDPR, ‘scientific research’ has a broad meaning. When taken together with the obvious possibility of consent-based research, we are confident that the Bill allows for the general type of testing you have described”.
It is good to hear that the Government interpret the research provisions as being broad enough to accommodate the research and development described. However, for organisations to use these provisions with confidence, they need to know whether the ICO and courts will take the same broad view.
There are other amendments which would broaden the understanding of the research definition, which no doubt the Minister will speak to and which the Government could support to leave no room for doubt for organisations. However, it is inaccurate to assume that all R&D will be consent based; in fact, very little of it will be. Given the need for consent to be a genuine choice to be valid, organisations can rarely rely on this as they need a minimum amount of reliable data for R&D that presents a representative sample for whatever they are doing. That is undermined by allowing individuals to opt in and out whenever they choose. In particular, for machine learning and AI, there is a danger of discrimination and bias if R&D has incomplete datasets and data that does not accurately represent the population. There have already been cases of poor facial recognition programmes in other parts of the world that do not recognise certain races because the input data did not contain sufficient samples of that particular ethnicity with which to train the model.
This is even more the case where the biometric data for research and development is for the purpose of improving systems to improve security. Those employing security and fraud prevention measures have constantly to evaluate and improve their systems to stay one step ahead of those with malicious intent. The data required for this needs to be guaranteed and not left to chance by allowing individuals to choose. The research and development to improve the system is an integral aspect of providing the system in the first place.
I hope that the Minister recognises some of those statements that he made in his letter and will be able, at least to some degree, to respond to the points that I have made. There has been some toing and froing, so I think that he is pretty well aware of the points being raised. Even if he cannot accept these amendments, I hope that he can at least indicate that biometrics is the subject of live attention within his department and that work will be ongoing to find a solution to some of the issues that I have raised. I beg to move.
My Lords, I wonder whether I might use this opportunity to ask a very short question regarding the definition of biometric data and, in doing so, support my noble friend. The definition in Clause 188 is the same as in the GDPR and includes reference to “behavioural characteristics”. It states that,
“‘biometric data’ means personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of an individual, which allows or confirms the unique identification of that individual, such as facial images or dactyloscopic data”.
As Shakespeare reminds us:
“There’s no art
To find the mind’s construction in the face”.
How do behavioural characteristics work in this context? The Minister may not want to reply to that now, but I would be grateful for an answer at some point.
My Lords, I thank the noble Lord, Lord Clement-Jones, for engaging constructively on this subject since we discussed it in Committee. I know that he is keen for data controllers to have clarity on the circumstances in which the processing of biometric data would be lawful. I recognise that the points he makes are of the moment: my department is aware of these issues and will keep an eye on them, even though we do not want to accept his amendments today.
To reiterate some of the points I made in my letter so generously quoted by the noble Lord, the GDPR regards biometric data as a “special category” of data due to its sensitivity. In order to process such data, a data controller must satisfy a processing condition in Article 9 of the GDPR. The most straightforward route to ensure that processing of such data is lawful is to seek the explicit consent of the data subject. However, the GDPR acknowledges that there might be occasions where consent is not possible. Schedule 1 to the Bill makes provision for a range of issues of substantial public interest: for example, paragraph 8, which permits processing such as the prevention or detection of an unlawful act. My letter to noble Lords following day two in Committee went into more detail on this point.
The noble Lord covered much of what I am going to say about businesses such as banks making use of biometric identification verification mechanisms. Generally speaking, such mechanisms are offered as an alternative to more conventional forms of access, such as use of passwords, and service providers should have no difficulty in seeking the data subject’s free and informed consent, but I take the point that obtaining proper, GDPR-compliant consent is more difficult when, for example, the controller is the data subject’s employer. I have considered this issue carefully following our discussion in Committee, but I remain of the view that there is not yet a compelling case to add new exemptions for controllers who wish to process sensitive biometric data without the consent of data subjects. The Bill and the GDPR make consent pre-eminent wherever possible. If that means employers who wish to install biometric systems have to ensure that they also offer a reasonable alternative to those who do not want their biometric data to be held on file, then so be it.
There is legislative precedent for this principle. Section 26 of the Protection of Freedoms Act 2012 requires state schools to seek parental consent before processing biometric data and to provide a reasonable alternative mechanism if consent is not given or is withdrawn. I might refer the noble Lord to any number of speeches given by members of his own party—the noble Baroness, Lady Hamwee, for example—on the importance of those provisions. After all, imposing a legislative requirement for consent was a 2010 Liberal Democrat manifesto commitment. The GDPR merely extends that principle to bodies other than schools. The noble Lord might respond that his amendment’s proposed subsection (1) is intended to permit processing only in a tight set of circumstances where processing of biometric data is undertaken out of necessity. To which I would ask: when is it genuinely necessary to secure premises or authenticate individuals using biometrics, rather than just cheaper or more convenient?
We also have very significant concerns with the noble Lord’s subsections (4) and (5), which seek to drive a coach and horses through fundamental provisions of the GDPR—purpose limitation and storage limitation, in particular. The GDPR does not in fact allow member states to derogate from article 5(1)(e), so subsection (5) would represent a clear breach of European law.
For completeness, I should also mention concerns raised about whether researchers involved in improving the reliability of ID verification mechanisms would be permitted to carry on their work under the GDPR and the Bill. I reassure noble Lords, as I did in Committee, that article 89(1) of the GDPR provides that processing of special categories of data is permitted for scientific research purposes, providing appropriate technical and organisational safeguards are put in place to keep the data safe. Article 89(1) is supplemented by the safeguards in Clause 18 of the Bill. Whatever your opinion of recitals and their ultimate resting place, recital 159 is clear that the term “scientific research” should be interpreted,
“in a broad manner including for example technological development and demonstration”.
This is a fast-moving area where the use of such technology is likely to increase over the next few years, so I take the point of the noble Lord, Lord Clement-Jones, that this is an area that needs to be watched. That is partly why Clause 9(6) provides a delegated power to add further processing conditions in the substantial public interest if new technologies, or applications of existing technologies, emerge. That would allow us to make any changes that are needed in the future, following further consultation with the parties that are likely to be affected by the proposals, both data controllers and, importantly, data subjects whose sensitive personal data is at stake. For those reasons, I hope the noble Lord is persuaded that there are good reasons for not proceeding with his amendment at the moment.
The noble Baroness, Lady Hamwee, asked about behavioural issues. I had hoped that I might get some inspiration, but I fear I have not, so I will get back to her and explain all about behavioural characteristics.
My Lords, I realise that, ahead of the dinner break business, the House is agog at details of the Data Protection Bill, so I will not prolong the matter. The Minister said that things are fast-moving, but I do not think the Government are; they are moving at the pace of the slowest in the convoy on this issue. We are already here. The Minister says it is right that we should have alternatives, but for a bank that wants facial recognition techniques, having alternatives is just not practical. The Government are going to have to rethink this, particularly in the employment area. As more and more banks require it as part of their identification techniques, it will become of great importance.
We are just around the corner from these things, so I urge the Minister, during the passage of the Bill, to look again at whether there are at least some obvious issues that could be dealt with. I accept that some areas may be equivocal at this point, but we are really talking not about the future but the present. I understand what the Minister says and I will read his remarks very carefully, as no doubt will the industry that increasingly uses and wants to use biometrics. In the meantime, I beg leave to withdraw the amendment.
Amendment 8 withdrawn.
Consideration on Report adjourned until not before 8.48 pm.