Data (Use and Access) Bill [HL]

Volume 841: debated on Tuesday 19 November 2024

Second Reading

Moved by

That the Bill be now read a second time.

Relevant document: 3rd Report from the Constitution Committee. Scottish, Welsh and Northern Ireland Legislative Consent sought.

My Lords, data is the DNA of modern life. It is integral to almost every aspect of our society and economy, from NHS treatments and bank transactions to social interactions. An estimated 85% of UK businesses handle some form of digital data, and the UK data economy was estimated to represent 6.9% of UK GDP. Data-enabled UK service exports accounted for 85% of total service exports, estimated to be worth £259 billion, but data use in the UK drives productivity benefits of around 0.12%, which is only one minute per worker per day.

We can do much more to drive productivity through data. That is why the Government are presenting the Data (Use and Access) Bill today, to harness the power of data to drive economic growth, support modern digital government and improve people’s lives. The Bill is forecast to generate £10 billion over 10 years, to underpin the Prime Minister’s missions and to fulfil several manifesto commitments; most importantly, it will help everyday processes for people, business and our public services.

The Bill has eight parts, which I will speak to in order. Before I start, I recognise that noble Lords have debated data legislation over a number of years, and many measures in the Bill will be familiar to them, as they are to me. I pay particular tribute to the noble Viscount, Lord Camrose, for his work on these measures in the past. That said, the Government and I have carefully considered the measures to be taken forward in this Bill, and noble Lords will notice several important changes that make the Bill more focused, more balanced and better able to achieve its objectives.

The first three parts are focused on growing the economy. First, we will create the right conditions to set up future smart data schemes. These models allow consumers and businesses to safely share information about themselves with authorised third parties, which can then in turn offer innovative uses, such as personalised market comparisons and financial advice. This measure, which is also a manifesto commitment, will cut costs, give people greater consumer choice and deliver economic benefit. In September this year, more than 11 million people—one in six of the UK population—were already making use of open banking services.

In Part 2, the Bill will legislate on digital verification services, meaning that organisations will be able to receive a trust mark if they are approved as meeting the stringent requirements in the trust framework and appear on the government register. As well as increasing trust in the market, these efficiency gains are expected to boost the UK economy by £4.3 billion over the next decade by doing things such as reducing the time spent completing checks to hire new workers from days to minutes.

Part 3, on the national underground asset register, or NUAR, will place this comprehensive digital map of the underground pipes and cables on a statutory footing. The measures mandate that owners of underground infrastructure, such as water companies or telecoms operators, register their assets on NUAR. This will deliver more than £400 million per year through more efficient data sharing, reduced accidents and delays, and improved worker safety. The proposed measures will also allow this data to be used for additional prescribed use cases, such as improved street work co-ordination, where commercial and national security considerations allow.

Part 4 relates to the format of the registers of births and deaths, allowing for the first time the possibility of digital registration.

Part 5 is specifically about data protection and privacy, although I stress that this Government are committed to the strongest data privacy protections throughout the Bill. This part of the Bill is the one that the Government and I have most thoroughly revisited. Our objective has been to address the current lack of clarity that impedes the safe development and responsible deployment of new technologies.

We have removed previous measures watering down the accountability framework, along with other measures that risked protections. Since the Bill’s introduction I have spoken to members of industry, civil society and the research community about this, as well as some noble Lords here today, and I am glad to note that these changes have been broadly welcomed. In this context, I would like to say something about AI, which will undoubtedly have a vital role to play in growing the UK’s economy and transforming its public services. This will include the responsible and safe use of solely automated decision-making. However, the rules in Article 22 of the UK GDPR are unclear, which holds us back. Organisations are not confident about when they can make solely automated decisions, nor about what safeguards apply and when. We all suffer when this uncertainty leads to hollow, token attempts at human involvement designed to game the rules.

The Bill will fix these issues. It writes the safeguards much more clearly. You will have the right to be told about a decision, the right to human intervention, and the right to make representations about it. It specifically provides that human involvement must be meaningful or else it does not count. This—alongside clearer safeguards, the restored accountability framework, and a modernised information commission—will help us strike the right balance between the benefits of this technology being available in more circumstances, and public trust and protection.

Part 6 is on the regulator: the new information commission. This is a new-look regulator—modernised, with clear strategic direction and stronger powers, and still independent. We will bring the information commission in line with regulatory best practice, increase accountability, and enable greater transparency for organisations and the public. It will be empowered to engage effectively with the increasingly complex opportunities and challenges we see in the use of personal data, as well as to ensure high data protection standards and increased public trust.

The Government have worked closely with the ICO on these reforms, and the commissioner noted in his response to the Bill that these changes

“will significantly improve the ICO’s ability to function effectively”

and the

“refreshed governance arrangements will maintain our independence and enhance our accountability”.

Part 7 includes other provisions about the use of or access to data. Clauses on NHS information standards will create consistency across IT systems to enable data sharing. This is a positive step in driving up efficiency in our NHS and will save 140,000 hours of staff time a year. These measures will also improve patient safety; for example, by allowing authorised medical staff to access patient data to provide care in emergencies.

There is a new, fairly technical measure on smart meters, which will provide the Gas and Electricity Markets Authority with flexibility to determine the best process to follow in appointing the successor smart meter communication licensee. These clauses will ensure that the authority is able to appoint a successor in a timely and efficient way that is in the best interests of energy consumers.

Part 7 also includes measures on online safety research, laying the groundwork for crucial research into online harms to help us learn and adapt, to keep the internet safe. This is in addition to measures on data preservation notices to help coroners, or procurators fiscal in Scotland, investigate how online platform use may have contributed to the tragic death of a child. I thank the noble Lord, Lord Bethell, and the noble Baroness, Lady Kidron, for their campaigning on these important issues, which we supported in opposition. I am pleased to be able to deliver these measures early in the new Parliament.

Finally, Part 8 includes standard final provisions.

As noble Lords can probably tell from the length of that list, this is quite a wide-ranging Bill. However, I hope they will agree that the focus—on growing the economy, supporting modern, digital government, and improving lives—is a lot clearer. In summary, I have three main points to encourage the swift passage of the Bill through the House.

First, I have worked very closely with noble Lords across the House on a number of these measures over the years. I am glad to have been able to make the necessary changes to the legislation in response to our shared concerns. Secondly, we are very keen to implement these changes as soon as possible for our stakeholders—the ICO, business, and the research community, to name but a few—which have all been waiting patiently to see the benefits these reforms will bring. Thirdly and most importantly, the measures in the Bill will make a material, positive difference to people’s lives.

I hope noble Lords will work with me to pass the Bill and ensure that these reforms can bring real benefits to our economy and public services and the UK public. I beg to move.

My Lords, I welcome the opportunity to speak on this important matter. I am especially grateful to the noble Baroness, Lady Jones of Whitchurch, for her engagement so far with me and my noble friend Lord Camrose, which has been truly helpful with this technically complex Bill. I thank in advance the other speakers and keenly look forward to hearing their—I hope—lucid and insightful commentary on the content of the Bill.

This is a wide-ranging Bill which affects a wide range of policy issues. If well executed, it could bring substantial benefits to individuals and businesses, much like the previous Conservative Government’s Bill, which was so ably championed by my noble friend Lord Camrose. However, if poorly executed, the Bill may result in data vulnerabilities for both individuals and the country as a whole.

We on these Benches are delighted that the Government are taking forward the bulk of the provisions and concepts set out by the previous Conservative Government, in particular the introduction of a national underground asset register, which will make construction and repairs more efficient, reduce the cost and inconvenience of roadworks and, most importantly, make work safer for construction workers; giving Ofcom the ability, when notified by the coroner, to demand that online service providers retain data in the event of a child’s death; reforming and modernising the Information Commissioner’s Office; introducing a centralised digital ID verification framework; allowing law enforcement bodies to make greater use of biometric data for counterterrorism purposes; and—particularly close to my heart—setting data standards in areas such as health to allow the sharing of data and its use for research and AI. All these provisions are necessary and will provide tangible benefits to the people of the UK. Indeed, the data economy is crucial. As the Government have rightly said, it has the potential to add billions in value to the UK economy through opportunities and efficiencies.

Therefore, noble Lords will find in us a largely supportive Opposition on this Bill, working constructively to get it through Parliament. However, we on these Benches have some concerns about the Bill, and I am keen to hear the Minister’s response on a number of points.

Many small and medium-sized enterprises control low-risk personal data. Although they must of course take careful measures to manage customer data, many simply do not have the resources to hire or train a data protection officer, particularly with the Government’s recent decision to increase burdens on employer NI. We have concerns that the Bill will disproportionately add to the weight of the requirements on those businesses, without greatly advancing the cause of personal privacy. We should free those SMEs—the bedrock of our economy —from following the same demanding data protection requirements as larger, better-resourced enterprises, which carry far greater risks to personal privacy. We need to allow them to concentrate on running a profitable business rather than jumping through myriad bureaucratic data protection hoops over data that, in many cases, presents little risk to privacy. In short, we need those businesses to be wisely careful but not necessarily hyper-careful.

Many of the Bill’s clauses allow or require mass digitisation of data by the public sector, such as the registers of births and deaths. These measures will improve efficiency and, therefore, save money—something that I think we can all agree is necessary. However, the more data is digitised, the more we present a tempting attack surface to hackers—thieves who steal data for profit by sale on the dark web, ransom or both. Do the Government intend to bring forward legislation that will set out improved cybersecurity requirements for public bodies that will, because of the Bill, rapidly digitise their datasets? Furthermore, if the Government intend to bring forward additional cybersecurity measures, when do they intend to do so? Any time lag leaves public bodies, and the people’s data they control, vulnerable to those with malicious intent.

Building on this point, the Bill will also see a rapid increase in the digitisation and sharing of high-risk data across the public and private sectors. There will, for example, be an increase in high-risk data sharing between the NHS and adult social care providers in communities, and a range of private sector companies handling identification data to create a digital ID. Again, the more high-risk data is used, transferred or created, the greater the incentive for hackers to target organisations and bodies. Therefore, I must ask the Minister whether and when the Government intend to bring forward additional cybersecurity measures for public bodies, large businesses, and the minority of SMEs that will handle high-risk data.

Introducing a national underground asset register, or NUAR, will lead to significant benefits for people and developers alike. It will substantially reduce the risk of striking underground infrastructure during development or repairs. This will not only speed up developments and repairs but reduce costs and the risks posed to construction workers. However, having a centralised register of all underground assets, including critical infrastructure, may result in a heightened terror risk. I know this Government, like the previous one, will have devoted considerable thought to this grave risk, and I hope the Minister will set out some of the Government’s approach to mitigating it. In short, how do they intend to ensure the security of NUAR so that there can be no possibility of unauthorised access to our critical infrastructure?

We on this Bench support the Government’s position on automated decision-making, or ADM. It can rapidly increase the speed at which commercial decisions are taken, thus resulting in an increase in sales and profit, improvements to productivity and a better customer experience. AI will be the key underlying technology of almost all ADM. The vast quantity of data and the unfathomable complexity of the algorithms mean that we have to address the AI risks of bias, unfairness, inaccuracy and loss of human agency. Therefore, I think it is wise that we consider amending this Bill to put some of the use of AI in this context on a statutory footing. I hope that the Minister will share the Government’s thoughts on this matter, and I am confident that colleagues across the House will have strong views too.

I end by outlining the opportunities for setting standards for health data. As Health Minister, I would often wax lyrical on how we have the best data in the world, with our ability to link primary and secondary care data with genomic, optical and myriad other data sources going back decades. Add to this the large heterogeneous population and you have, without doubt, the best source of health data in the world. I firmly believe that by setting the data standards we can build in the UK the foundations for a Silicon Valley for the life sciences, which would be a massive benefit to patients, the NHS and the UK economy overall.

We on this Bench largely welcome the Bill, not least because it retains many of the concepts from the previous Conservative Government’s Bill. However, there are important matters that deserve our attention. I look forward to hearing today the views of noble Lords across the House to enable the productive passage of the Bill.

My Lords, I declare my interests as chair of the 5Rights Foundation and as an adviser to the Institute for Ethics in AI at Oxford.

I start by expressing my support for the removal of some of the egregious aspects of the last Bill that we battled over, and by welcoming the inclusion of access to data for researchers—although I believe there are some details to discuss. I am extremely pleased finally to see provisions for the coroner’s access to data in cases where a child has died. On that alone, I wish the Bill swift passage.

However, a Bill being less egregious is not sufficient on a subject fundamental to the future of UK society and its prosperity. I want to use my time this afternoon to ask how the Government will not simply make available, but secure the full value of, the UK’s unique datasets; why they do not fully make the UK AI-ready; and why proposals that they did not support in opposition have been included and amendments that they did support have been left out.

We have a unique opportunity, as the noble Lord, Lord Markham, just described, with unique publicly held datasets, such as the NHS’s. At a moment at which the LLMs and LMMs that will power our global future are being built and trained, these datasets hold significant value. Just as Britain’s coal reserves fuelled global industrial transformation, our data reserves could have a significant role to play in powering the AI transformation.

However, we are already giving away access to national data assets, primarily to a handful of US-based tech companies that will make billions selling the products and services built upon them. That creates the spectre of having to buy back drugs and medical innovations that simply would not have been possible without the incalculably valuable data. Reimagining and reframing publicly held data as a sovereign asset accessed under licence, protected and managed by the Government acting as custodian on behalf of UK citizens, could provide direct financial participation for the UK in the products and services built and trained on its data. It could give UK-headquartered innovators and researchers privileged access to nationally held data sets, or support investment in small and medium-sized specialist LLMs, which we will debate later in the week. Importantly, it would not simply monetise UK data but give the UK a seat at the table when setting the conditions for use of that data. What plans do the Government have to protect and value publicly held data in a way that maximises its long-term value and the values of the UK?

Similarly, the smart data schemes in the Bill do not appear to extend the rights of individual data holders to use their data in productive and creative ways. The Minister will recall an amendment to the previous data Bill, based on the work of associate professor Reuben Binns, that sought to give individuals the ability to assign their data rights to a third party for agreed purposes. The power of data is fully realised only when it is combined. Creating communal rights for UK data subjects could create social and economic opportunities for communities and smaller challenger businesses. Again, this is a missed opportunity to support the Government’s growth agenda.

My second point is that the Bill fails to tackle present-day or anticipated uses of data by AI. My understanding is that the AI Bill is to be delayed until the Government understand the requirements of the new American Administration. That is concerning on many levels, so perhaps the Minister can say something about that when she winds up. Whatever the timing, since data is, as the Minister said, in the DNA of AI infrastructure, why does the Bill so spectacularly fail to ensure that our data laws are AI-ready? As the News Media Association says, the Bill is silent on the most pressing data policy issue of our time: namely, that the unlicensed use by AI developers of data created by the media and broader creative industries represents IP theft on a mass scale.

Meanwhile, a single-sentence petition that says,

“The unlicensed use of creative works for training generative AI is a major, unjust threat to the livelihoods of the people behind those works, and must not be permitted”,

has been signed by nearly 36,000 organisations and individuals from the creative community. This issue was the subject of a cross-party amendment to which Labour put its name, which would have put the voluntary web standards represented by the robots.txt protocol on a mandatory opt-in basis—likely only one of several amendments needed to ensure that web indexing does not become a proxy for theft. In 2022, it was estimated that the UK creative industries generated £126 billion in gross value added to the economy and employed 2.4 million people. Given their importance to our economy, our sense of identity and our soft power, why do we have a data Bill that is silent on data scraping?
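For context on the protocol mentioned above: robots.txt is a plain-text file served at a website’s root that asks crawlers to stay away from specified paths. Compliance is entirely voluntary, which is precisely the weakness the cross-party amendment sought to address. A hypothetical file attempting to exclude an AI training crawler (the user-agent name below is purely illustrative) might read:

```text
# Served at https://example.com/robots.txt
# Note: directives are advisory only; a crawler may simply ignore them.

User-agent: ExampleAIBot
# Hypothetical AI training crawler: ask it to crawl nothing.
Disallow: /

User-agent: *
# All other crawlers: no restrictions.
Disallow:
```

Because nothing in the protocol compels a crawler to honour these directives, the amendment would have inverted the default from opt-out to mandatory opt-in.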

In my own area of particular concern, the Bill does not address the impact of generative AI on the lives and rights of children. For example, instead of continuing to allow tech companies to use pupil data to build unproven edtech products based on drill-and-practice learning models—which in any other form is a return to Victorian rote learning but with better graphics—the Bill could and should introduce a requirement for evidence-based, pedagogically sound paradigms that support teachers and pupils. In the recently announced scheme to give edtech companies access to pupil data, I could not see details about privacy, quality assurance or how the DfE intends to benefit from these commercial ventures which could, as in my previous NHS example, end with schools or the DfE having to buy back access to products built on UK pupil data. There is a quality issue, a safety issue and an ongoing privacy issue in our schools, and yet nothing in the Bill.

The noble Baroness and I met to discuss the need to tackle AI-generated sexual abuse, so I will say only that each day that it is legal to train AI models to create child sexual abuse material brings incalculable harm. On 22 May, specialist enforcement officers and I, along with the noble Viscount, Lord Camrose, were promised that the ink was almost dry on a new criminal offence. It cannot be that what was possible on that day now needs many months of further drafting. The Government must bring forward in this Bill the offence of possessing, sharing, creating or distributing an AI file that is trained on or trained to create CSAM, because this Bill is the first possible vehicle to do so. Getting this on the books is a question of conscience.

My third and final point is that the Bill retains some of the deregulatory aspects of its predecessor, while simultaneously missing the opportunity of updating data law to be fit for today. For example, the Bill extends research exemptions in the GDPR to

“any research that can reasonably be described as scientific”,

including commercial research. The Oxford English Dictionary says that “science” is

“The systematic study of the structure and behaviour of the physical and natural world through observation, experimentation, and the testing of theories against the evidence obtained”.

Could the Minister tell the House what is excluded? If a company instructs its data scientists and computing engineers to develop a new AI system of any kind, whether a tracking app for sport or a bot for an airline, is that scientific research? If their behavioural scientists are testing children’s response to persuasive design strategies to extend the stickiness of their products, is that scientific research? If the answer to these questions is yes, then this is simply an invitation to tech companies to circumvent privacy protections at scale.

I hope the noble Baroness will forgive me for saying that it will be insufficient to suggest that this is just tidying up the recitals of the GDPR. Recital 159 was deemed so inadequate that the European Data Protection Supervisor formally published the following opinion:

“the special data protection regime for scientific research is understood to apply where … the research is carried out with the aim of growing society’s collective knowledge and wellbeing, as opposed to serving primarily one or several private interests”.

I have yet to see that the Government’s proposal reflects this critical clarification, so I ask for some reassurance and query how the Government intend to account for the fact that, by putting a recital on the face of the Bill, it changes its status.

In the interests of time, I will put on the record that I have a similar set of issues about secondary processing, recognised legitimate interests, the weakening of purpose limitation, automated decision-making protections and the Secretary of State’s power to add to the list of special category data per Clause 74. These concerns are shared variously by the ODI, the Ada Lovelace Institute, the Law Society, Big Brother Watch, Defend Digital Me, 5Rights, Connected by Data and others. Collectively, these measures look like the Government are paving a runway for tech access to the private data of UK citizens or, as the Secretary of State for DSIT suggested in his interview in the Times last Tuesday, that the Government no longer think it is possible to regulate tech giants at all.

I note the inclusion of a general duty on the ICO to consider the needs of children, but it is a poor substitute for giving children wholesale protection from any downgrading of their existing data rights and protections, especially given the unprecedented obligations on the ICO to support innovation and stimulate growth. As the Ada Lovelace Institute said,

“writing additional pro-innovation duties into the face of the law … places them on an almost equivalent footing to protecting data subjects”.

I am not sure who thinks that tech needs protection from individual data rights holders, particularly children, but unlike my earlier suggestion that we protect our sovereign data assets for the benefit of UK plc, the potential riches of these deregulation measures disproportionately accrue to Silicon Valley. Why not use the Bill to identify and fix the barriers the ICO faces in enforcing the AADC? Why not use it to extend existing children’s privacy rights into educational settings, as many have campaigned for? Why not allow data subjects more freedom to share their data in creative ways? The Data (Use and Access) Bill has little in it for citizens and children.

Finally, but by no means least importantly, is the question of the reliability of computers. At col. GC 576 of Hansard on 24 April 2024, the full tragedy of the postmasters was set out by the noble Lord, Lord Arbuthnot, who is in his place and will say more. The notion that computers are reliable has devastated the lives of postmasters wrongly accused of fraud. The Minister yesterday, in answer to a question from the noble Lord, Lord Holmes, suggested that we should all be “more sceptical” in the face of computer evidence, but scepticism is not legally binding. The previous Government agreed to find a solution, albeit not a return to 1999. If the current Government fail to accept that challenge, they must shoulder responsibility for the further miscarriages of justice which will inevitably follow. I hope the noble Baroness will not simply say that the reliability of computers and the other issues raised are not for this Bill. If they are not, why not? Labour supported them in opposition. If not, then where and how will these urgent issues be addressed?

As I said at the outset, a better Bill is not a good Bill. I question why the Government did not wait a little longer to bring forward a Bill that made the UK AI ready, understood data as critical infrastructure and valued the UK’s sovereign data assets. It could have been a Bill that did more work in reaching out to the public to get their consent and understanding of positive use cases for publicly held data, while protecting their interests—whether as IP holders, communities that want to share data for their own public good or children who continue to suffer at the hands of corporate greed. My hope is that, as we go to Committee, the Government will come forward with the missing pieces. I believe there is a much more positive and productive piece of legislation to be had.

My Lords, I remind the House of my interests, particularly in chairing the board of Century-Tech, an AI edtech company— I will have to talk to the noble Baroness, Lady Kidron, about that. I am a director of Educate Ventures Research, which is also an AI-in-education business, and Goodnotes, an AI note-taking app. It is a pleasure to follow the noble Baroness, Lady Kidron. I agreed with most of what she said, and I look forward to working with her on the passage of the Bill.

I guess we are hoping it is third time lucky with a data Bill. I am sure we will hear from all speakers that there is a sense that this is an improved Bill on the previous two attempts. It is a particular joy to see that terrible stuff around DWP data not appearing in this Bill. There is plenty that I welcome in terms of the improvements. Like most speakers, I imagine, I mostly want to talk about what might need further debate and what might be missing, rather than just congratulating my noble friend the Minister on the improvements she and her colleagues have been able to make.

I anticipate that this will not be the only Bill we have on data and AI. It would be really helpful for this Government to rediscover the joys of a White Paper. If we had a document that set out the whole story and the vision, so that we could more easily place this Bill in context, that would be really helpful. This could include where we are with the Matt Clifford action plan, and a very clear aim of data adequacy with the EU regime. I wonder whether, among all the people the Minister said she had been able to talk to about this Bill, she had also spoken to the EU to make sure we are moving in the right direction with adequacy, which has to be resolved by the summer.

Clearly, this is a Bill about data. The Minister said that data is the DNA of modern life. It has achieved a new prominence with the rollout of generative AI, which has captured everyone’s imagination—or entered their nightmares, depending on how you think about it. The Communications and Digital Committee, which I am privileged to serve on, has been thinking about that in respect of the future of news, which we will publish a report on shortly, and of scaling our businesses here in the UK. It is clear that the core ingredients you need are computing power, talent, finance and, of course, data, in order successfully to grow AI businesses here.

I agree with the noble Baroness, Lady Kidron, that we have a unique asset in our public sector datasets that the US does not have to anything like the same extent—in particular in health, but also in culture and education. It is really important that the Government have a regime, established by this legislation and any other legislation we may or may not know about, to protect and deploy that data to the public benefit and not just the private benefit, be it in large language models or other foundational models of whatever size.

It is then also important to ask, whose data is it? In my capacity as chair of a board of an AI company, I am struck by the fact that our current financial regulation does not allow us to list our data as an asset on our balance sheet. I wonder when we might be able to move in that direction, because it is clearly of some significance to these sorts of businesses. But it is also true that the data I share as a citizen, and have given consent to, should be my data. I should have the opportunity to get it back quite easily and to decide who to share it with, and it should empower me as a citizen. I should be able to hold my own data, and I definitely should not have to pay twice for it: I should not have to pay once through my taxes and then a second time by having to pay for a product that has been generated by the data that I paid for the first time. So I am also attracted to what the noble Baroness said about data as a sovereign asset.

In the same way that both Front-Bench speakers were excited about the national underground asset register, I am equally excited about the smart data provisions in the Bill, particularly in respect of the National Health Service. Unfortunately, my family have been intensive users of the National Health Service over the past year or so, and the extent to which the various elements of our NHS do not talk to each other in terms of data is a tragedy that costs lives and that we urgently need to resolve. If, as a result of this Bill, we can take the glorious way in which I can share my banking data with various platforms in order to benefit myself, and do the same with health data, that would be a really good win for us as a nation. Can the Minister reassure me that the same could be true for education? The opportunity to build digital credentials in education by using the same sort of technology that we use in open banking would also excite me.

I ask the Minister also to think about and deliver on a review of Tell Us Once, which, when I was a Minister in the DWP a long time ago, I was very happy to work on. By using Tell Us Once, on the bereavement of a relative, for example, you have to tell only one part of the public sector and that information then cascades across. That relieves you of an awful lot of difficult admin at a time of bereavement. We need a review to see how this is working and whether we can improve it, and to look at a universal service priority register for people going through bereavement in order to prioritise services that need to pass the message on.

I am concerned that we should have cross-sector open data standards and alignment with international interoperability standards. There is a danger in the Bill that the data-sharing provisions are protected within sectors, and I wonder whether we need some kind of authority to drive that.

It is important to clarify that the phrase used in the first part of the Bill, a

“person of a specified description”,

can include government departments and public bodies so that, for example, we can use those powers for smart data and net-zero initiatives. Incidentally, how will the Government ensure that the supply chains of transformers, processors, computing power and energy are in place to support AI development? How will we publish the environmental impact of that energy use for AI?

There is a lot more I could say, but time marches on. I could talk about digital verification services, direct marketing and a data consent regime, but those are all things to explore in Committee. However, there are two other things that I would briefly like to say before winding up. First, I have spoken before in this House about the number of people who are hired, managed and fired by AI automated decision-making. I fear that, under the Bill as drafted, those people may get a general explanation of how the automated decision-making algorithms are working, when in those circumstances they need a much more personalised explanation of why they have been impacted in this way. What is it about you, your socioeconomic status and the profile that has caused the decision to go the way it has?

Secondly, I am very interested in the role of the Digital Regulation Cooperation Forum in preventing abuse and finding regulatory gaps. I wonder whether, after the perennial calls in this Chamber when debating Bills such as this for a permanent Committee of both Houses to monitor digital regulation, the new Government have a view on that. I know that that is a matter for the usual channels and not Ministers, but it is a really important thing for this House to move on. I am fairly bored with making the case over the past two or three years.

In summary, this is a good Bill but it is a long Bill, and there is lots to do. I wish the Minister good luck with it.

My Lords, I, too, welcome the Bill, but there is one matter we should have at the forefront of our minds as we work through it: that it must be implemented and carried through by SMEs and individuals. Regrettably—and I say this as a lawyer—lawyers have become far too expensive. We must appreciate the need to draft legislation and regulatory regimes that are as easy as possible to operate without the benefit of legal advice. If we cannot achieve that, it must be incumbent on the Government and the regulators to set out clearly what the position is, in a way that people can understand. We do not want our SMEs and individual traders to enter into operating under this new regime without being able to understand the law. I fear that this Bill, by its very length, is a good example of how we can overcomplicate things.

The second issue is the protection and transferability of data. The Minister, the noble Lord, Lord Markham, and the noble Baroness, Lady Kidron, have all spoken about the importance and value of data, its transferability and the need to balance correctly the protections and rights of the individual against the importance of being able to use it in research. I want to say a word about the contrasting positions we face in the transferability of data between us and the European Union, and the slightly more difficult and unpredictable situation that may arise between us and the United States. They are the same problem, but they may need addressing in different ways. On the first, I need to be slightly technical, but as the adequacy of our data regime is such an important issue, I hope that noble Lords will forgive me.

I am going to ask the Minister a question, but it is not for answer today; I think it will require a bit more than that. It takes us back to the battles and debates we have had over the last six years in relation to the manner of our withdrawal from the European Union. When we left the EU, we left in place retained EU law. We got rid of the charter, because it was said that all that mattered and was important was embodied in retained EU law. That was almost certainly right, but the problem that I believe has arisen—it is partly complicated by advice contained in the Government’s human rights memorandum attached to the Bill—arises from the effect of the Retained EU Law (Revocation and Reform) Act. I can hear, almost visibly, the sighs—“Are we back to that again?”—and I am so sorry to be dredging this up.

I have looked at various things—I am particularly grateful for the help I have had from Eleonor Duhs of Bates Wells—and I believe there is a problem we need to address. As data adequacy is so important, I will say a word about the detail. At the moment, I think we proceed on the assumption that the UK GDPR, with its numerous references to the data subject’s rights and freedoms, is adequate. The last Government, when dealing with the matter, passed the Data Protection (Fundamental Rights and Freedoms) (Amendment) Regulations, which said that all the many references in the UK GDPR to these rights are to be read as referring to

“the Convention rights within the meaning of the Human Rights Act”.

The difficulty that has arisen is in paragraph 47 of the Government’s human rights memorandum:

“Where processing is conducted by a public authority and engages a right under the ECHR, that authority must, in accordance with section 6 of the Human Rights Act 1998, ensure that such processing is not incompatible with a convention right”.

Then comes the important sentence:

“Where processing is conducted by a private body, that processing will not usually engage convention rights”.

The important point is that it is generally understood that, save in specific circumstances, the Human Rights Act applies only to state entities and not to private companies. If and where data is being processed by private entities, as the Bill and the market largely envisage, how are we to be sure that our references in the UK GDPR refer to the human rights convention but not to the charter? Having lost retained EU law, how are data privacy and data rights protected when data is processed by private companies?

I raise this point because it is important that we clarify it. If there is an issue, and I hope the Government will look at this carefully, we will need to amend the Bill to make sure that there can be no doubt that, where data is processed by private companies, the data rights are properly protected as they would have been if we had retained EU law, or if the charter applied. It is a very narrow point, but one of fundamental importance: the Human Rights Act is directed, by and large, at state actors and not at private entities. I am sorry to take up a little time on this very general subject, but data protection is so important, and retaining our data adequacy status is, as I have learned over many years, essential to our industry.

We know that, provided we can get our law in order, there is no problem as regards the EU, I hope. We face a much more difficult problem with regard to data dealings with the United States. First, the law is much more complicated and developing at an enormous pace. It is partly federal and partly state. Of course, we have no idea—and I am not going to speculate, because speculation is pointless—what may happen under the new Administration in the United States. One thing we have learned from the EU, particularly the EU AI Act, is that legislating in terms that are hard can produce results that very quickly get out of date. It seems to me that we have to look constructively at finding a way to adapt our legislative framework to what happens in the United States as regards transferability and, more importantly, the protection of our data in respect of the very large American companies. How are we to do this? Do we give Ministers very broad statutory powers? There may, I regret to say, be a case for doing that. It is something that I do not favour. If Ministers are to have such broad statutory powers, how is that power to be made properly accountable to this House?

As the noble Baroness, Lady Kidron, demonstrated, there is no use delaying these decisions until we know what the US regime may be. Maybe the US regime, unlike the EU, will change very rapidly. Bureaucracy has some advantages when you are dealing with it from the outside, but someone who believes in constant change and turmoil is much more difficult to deal with from our legislative point of view. It is a very important aspect of this legislation that we look at how, in the transnational market in data, which is of immense value and importance to us, we protect the British public.

There are loads of other points that one could raise, but I will raise only one, to follow what has just been said. It is of fundamental importance that we examine automated decision-making with the greatest care. Some very good principles have been developed both in the United States, under the current regime, and in Europe. When a decision is made by a machine—that is a rather facile way of describing it; it is made as a result of an algorithmic process—how do we ensure that, first, there is some right to a human intervention and, secondly, and equally importantly, that the person affected understands why the decision has been made? The point that has just been made is very important, because when you get a decision from an individual, you normally have it accompanied by an understanding of the human, plus reasons. This is a very important part of the Bill; it is so important to give confidence about the way forward.

There are many other detailed points, but those are the three principal points I wanted to make. Let us keep it simple, look at the transnational aspects and look at automated decision-making.

My Lords, it is always rather daunting following the noble and learned Lord, Lord Thomas. I think the safest thing to say, which has the added benefit of being true, is that I agreed with him.

I declare my interests as set out in the register, in particular that I am a member of the Horizon Compensation Advisory Board and the chair of the advisory panel of Thales UK, which makes the British passport. As the noble Lord, Lord Knight, said, this Bill is a wonderful opportunity to talk about everything that is not in it and to discuss further measures that could be included. The noble Baroness, Lady Kidron, mentioned the amendment she moved on 24 April to the predecessor Bill, designed to deal with the presumption that computer evidence is reliable, despite the fact that we all know that it is not. We shall need to come to that presumption in Committee.

I supported the amendment from the noble Baroness in Committee earlier this year, although I accept—as I think she does—that simply returning to the position as it was in 1999, before the presumption existed, may not be the best solution. We need some method, for example, of accepting that breathalysers, on the whole, work as they are intended to do, and that emails, on the whole, work as they are intended to do, and we should not have to get a certificate of accuracy from Microsoft before every trial.

The need to find a solution to the problems so brutally exposed by the Post Office scandal is urgent. In the Post Office cases, in essence, the proper burden of proof was reversed and hearsay evidence that was false was accepted as gospel truth. As a result of Horizon and the appalling human behaviour that accompanied it, the lives of hundreds, perhaps thousands, of postmasters were ruined and the UK and Fujitsu are going to have to pay billions in compensation. So this matter is urgent.

The solution may be, as Alistair Kelman has recommended, a set of seven or so detailed statements setting out what has been done to ensure that the computer evidence is reliable. It may be, as a senior and highly respected judge has recommended, a division between complex cases, such as those involving a system such as Horizon, and simple cases, such as those involving breathalysers and emails, with more stringent evidentiary requirements for the complex cases. It may be, as Professor Steven Murdoch has suggested, that documents designed to test the reliability of computer systems should be made available to the other side and be subject to challenge. It may be something else, but this Bill is an opportunity to consider, examine and test those solutions, and another such opportunity may not come along quickly. I repeat: this matter is urgent.

On a different matter, Part 2 of the Bill establishes a regulatory framework for the provision of digital verification services in the UK. We need to be clear that having a clear and verifiable digital identity is a completely different matter from going down the route of identity cards. This is not an identity card Bill. It is an essential method of establishing, if you want or need to have a digital identity, that you are who you say you are and you have the attributes that you say you have. It is a way of establishing relevant facts about yourself without having to produce gas bills. I do not know about other noble Lords, but I find producing gas bills rather tricky now that they are almost all online.

Sometimes the fact you need to establish will be age: to establish that you are allowed to drink or to drive, or that you are still alive, or whatever. Sometimes it will be your address; sometimes it will be your sex. We do not want men going to women’s prisons, nor men who identify as women working in rape crisis centres. Sex is an issue on which it is necessary to have some degree of factual clarity in those circumstances where it matters. The Bill, again, is an opportunity to ensure that this factual clarity exists in the register of births. It will then be for the individual to decide whether to share the information about their sex, age, or whatever.

An organisation called Sex Matters—I am grateful for the briefing—issued a report yesterday pointing out that, at the moment, data verification services are not authoritative, in that they allow people to change their official records to show them as the opposite sex on request. One consequence is that, for example, transgender people risk being flagged up as a synthetic identity risk and excluded, for example, from banking or travel. Another is that illnesses may be misdiagnosed or that medical risks may fail to be identified.

So this Bill is a rare opportunity to put right some things that currently need to be addressed. Those of us speaking today have received a number of helpful briefings from organisations interested in various issues: I have mentioned only a couple. I hope we will take the opportunity given to us by the Bill to take on board several of those proposals.

My Lords, it is a feature of your Lordships’ House that certain topics and Bills within them tend to attract a small and very intense group of persons, who get together to talk a language that is not generally understood by the rest of the world—certainly not by the rest of this House—and get down to business with an enthusiasm and attitude which is very refreshing. I am seeing smiles from the other side of the House. This is not meant to be in any way a single-party point—just a very nice example of the way in which the House can operate.

I have already been struck today, as I am sure have others in the group that I am talking about—who know who they are—by the recognition that we have perhaps been a little narrow in our thinking. A couple of the speeches today have brought a new thought and a new sense of engagement with this particular subject and the others we deal with. We need to be aware of that, and I am very grateful to those noble Lords. In addition, I am grateful to the noble Lord, Lord Knight, for repeating the speeches he has had to make in 2018 and on subsequent dates, and for the wonderfully grumpy speech from the noble Baroness, Lady Kidron. We also have to take into account what we got wrong on joining the European market—which I certainly look forward to. It is a serious point.

I am also very grateful to my noble friend the Minister for setting out the new Government’s vision for data protection, for her letters—which have been very useful—and for her help in setting up the meeting I had with her officials, which I found very useful indeed. Our Minister has done a really good job in getting the Bill ready so quickly. It is good that some of the more egregious measures included in the previous Bill—particularly the changes on direct marketing during elections and the extensive access to bank account details—have gone. There are indeed some good extras as well.

We have already had some excellent speeches setting out some concerns. I have one major concern about the current Bill and three rather lesser issues which I suspect will need further debate and discussion in Committee. I will cover them quite briefly. My major concern is that, although the Bill has the intention to boost growth and productivity, and also makes a valiant attempt to provide a unified set of rules and regulations on data processing, it may in the process have weakened the protections that we want to see here against the exploitation of personal data. Data, as other noble Lords have said, is of course not just for growth and prosperity. There will be, as we have heard, clear, practical benefits in making data work for the wider social good and for the empowerment of working people. There is huge potential for data to revitalise the public services. Indeed, I liked the point made by the noble Lord, Lord Knight, that data is in some way an asset missing from the balance sheet of many operations, and we need to think carefully about how best we can configure that to make sure that the reality comes to life.

There has been, of course, a huge change. We have moved into the age of AI, but we do not have the Bill in front of us that will deal with that. The GDPR needs a top-to-toe revision so that we can properly regulate data capture, data storage, and how it may be best shared in the public interest. As an example of that, following the Online Safety Act we have a new regulator in Ofcom with the power to regulate technology providers and their algorithmic impacts. The Digital Markets, Competition and Consumers Act has given the Competition and Markets Authority new and innovative powers to regulate commercial interests, which we heard about yesterday at an all-party group. However, this Bill has missed the opportunity to strengthen the role of the ICO so we can provide a third leg capable of regulating the use of data in today’s AI-dominated world. This is a gap that we need to think very carefully about.

I hope my noble friend the Minister will acknowledge that there is a long way to go if this legislation is to earn public confidence and if our data protection regime is to work not just for the tech monopolies but for small businesses, consumers, workers and democracy. We must end the confusion, empower the regulators, and in turn empower Parliament.

There are three specific issues, and I will go through them relatively quickly. The first is on Clauses 67 and 68, already referred to, where the Bill brings in wording from Recital 159 of the GDPR—as we inherited it from the EU. This sets out how the processing of personal data for scientific research purposes should be interpreted. The recital is drafted in extraordinarily broad terms, including

“technological development and demonstration, fundamental research, applied research and privately funded research”.

It specifically mentions that:

“Scientific research purposes should also include studies conducted in the public interest in the area of public health”.

The latest ICO guidance, which contains a couple of references to commercial scientific research, says that such research

“can also include research carried out in commercial settings, and technological development, innovation and demonstration”.

However, we lack a definition, which is rather curious, because a definition of research does exist elsewhere in UK statute. It is needed, for example, to fund the research councils, and it forms part of the tax code for the purpose of obtaining tax relief for research. So we have a definition somewhere else, but somehow the Bill avoids it and instead goes down a clarification route, bringing forward into the current legislation that which is already the law—according to those who have drafted it—but which is of course so complicated that it cannot be understood. I think the Government’s thinking is to provide researchers with consistency, and they say very firmly that the Bill does not create any new permissions for using or reusing data for research purposes. In my meeting with officials, they were insistent that these clauses are about fine-tuning the data protection framework, making clarifications and small-scale changes and reducing uncertainties.

I agree that it is helpful to have the key provisions—currently buried, as they are, in the recitals—on the face of the Bill, and it may be that the new “reasonableness” test will give researchers greater clarity. Of course, we also retain the requirement that research must be in the public interest. But surely the issue that we need to address is whether the Bill, by incorporating new language and putting in this new “reasonableness” test, will permit changes to how data held by the NHS, including patients’ medical records, could be used and shared. It may be that the broad definition of “scientific research”, which can be “publicly or privately funded” and “commercial or non-commercial”, inadvertently waters down consent protections and removes purpose-limitation safeguards. Without wishing to be too alarmist, we need to be satisfied that these changes will not instigate a seismic change in the rules currently governing NHS data.

It is relevant to note that the Government have stated in a separate way an intention to include in the next NHS 10-year plan significant changes as to how patients’ medical records are held and how NHS data is used. Launching a “national conversation” about the plans, the Secretary of State, my right honourable friend Wes Streeting MP, highlighted a desire to introduce electronic health records called “patient passports” and to work “hand in hand” with the private sector to use data to develop new treatments. He acknowledged that these plans would raise concerns about privacy and about how to get the

“best possible deal for the NHS in return”

for private sector access to NHS data. The details of this are opaque. As currently drafted, the Bill is designed to enable patient passports and sharing of data with private companies, but to my mind it does not address concerns about patient privacy or private sector access to health data. I hope we can explore that further in Committee and be reassured.

My second point concerns the unlicensed use of data created by the media and broader creative industries by developers of large language models—this has already been referred to. UK copyright law is absolutely clear that AI developers must obtain a licence when they are text or data mining—the technique used to train AI models. The media companies have suggested that the UK Government should introduce provisions to ensure that news publishers and others can retain control over their data; that there must be significant penalties for non-compliance; and that AI developers must be transparent about what data their crawlers have “scraped” from websites—a rather unpleasant term, but that is what they say. Why are the Government not doing much more to stop what seems clearly to be theft of intellectual property on a mass scale, and if not in this Bill, what are their plans? At a meeting yesterday of the APPG which I have already referred to, it was clear that the CMA does not believe that it is the right body to enforce IP law. But if it is not, who is, and if there is a gap in regulatory powers, should this Bill not be used to ensure that the situation is ameliorated?

My third and final point is about putting into statute the previous Government’s commitments about regulating AI, as outlined in the rather good Bletchley declaration. Does my noble friend not agree that it would be at least a major statement of intent if the Bill could begin to address

“the protection of human rights, transparency and explainability, fairness, accountability, regulation, safety, appropriate human oversight, ethics, bias mitigation, privacy and data protection”?

These are all points raised in the Bletchley declaration. We will need to address the governance of AI technologies in the very near future. It does not seem wise to delay, even if the detailed approach has yet to be worked through and consulted upon. At the very least, as has been referred to, we should be picking up the points made by the Ada Lovelace Institute about: the inconsistent powers across regulators; the absence of regulators to enforce the principles such as recruitment and employment, or diffusely regulated areas of public service such as policing; the absence of developer-focused obligations; and the absence and high variability of meaningful recourse mechanisms when things go wrong, as they will.

When my noble friend Lord Knight of Weymouth opened the Second Reading of the last Government’s data protection Bill, he referred to his speech on the Second Reading during the passage of the 2018 Act—so he has been around for a while. He said:

“We need to power the economy and innovation with data while protecting the rights of the individual and of wider society from exploitation by those who hold our data”.—[Official Report, 19/12/23; col. 2164.]

For me, that remains a vision that we need to realise. It concerns me that the Bill will not achieve that.

My Lords, like others, I think I am experiencing the same sense of déjà vu that has been referred to. As others said, one of the more welcome aspects of this Bill is that it is not the same as its predecessor, which was introduced by the previous Government and which was mercifully a casualty of the election. Many of us lost far too many hours of our lives on that Bill, which was, frankly, a bad one—others have called it egregious.

So, I am pleased that this Government have clearly taken account of those debates—perhaps some of those hours were not wasted after all—and have produced a slightly slimmed-down version. That, in part, is because some of the old Bill has been removed from this one, but I am afraid it is expected to reappear; I hate to disappoint the noble Lords, Lord Knight and Lord Stevenson, but we are going to see those DWP bank account access clauses in a separate Bill. However, at least it will be a stand-alone Bill rather than tucked away at the back of a two-inch-thick data Bill.

I will start with a general concern, mentioned by the noble and learned Lord, Lord Thomas, and raised by a number of us in the context of the last Bill: EU data adequacy. The helpful letter from the noble Lord, Lord Ricketts, the chair of the European Affairs Committee, dated 22 October to the Secretary of State for Science, Innovation and Technology, sets out very clearly the

“significant extra costs and administrative burdens on businesses and public-sector organisations which share data between the UK and the EU”

that would be incurred if we were to lose that data adequacy ruling, which is due to expire in June 2025—so very soon. I do not think I have seen a response from the Government to that letter, so I would be very interested to hear what the Minister has to say on that. Although this Bill is clearly less contentious than its predecessor and the risk is therefore clearly lower, it is not zero risk, and we need to be careful to ensure that there is nothing in the Bill that significantly risks the loss of that ruling.

To that end, I would be grateful if the Minister could explain what assessment the Government have made of the risk of losing the EU data adequacy ruling and, perhaps more importantly, tell us the extent to which the Bill has been discussed with our European counterparts to ensure that there is nothing in it that is concerning them. Clearly, we do not need to and should not follow the letter of the EU data protection rules, but we should at least work with our EU counterparts to ensure that we are not risking the adequacy ruling.

Part 1 deals with so-called smart data. I welcome it but note that it consists mainly of a series of powers to regulate rather than any firm steps, which is a little disappointing. The only current live example of smart data that we have is open banking, which a number of noble Lords have referred to—maybe, one day, we will see a pensions dashboard; who knows? However, open banking has been rather slower to take off than had been hoped. It has been six or seven years since it was first mooted. I urge the Government to carry out a review of why that is, before they start to make the regulations that the Bill proposes around smart data. There are lessons to be learned from open banking, to ensure that what we do with smart data in the future is more successful. The claim that smart data will boost the UK economy by £10 billion over the next 10 years looks a little optimistic, especially as the impact assessment from the Department for Business and Trade accompanying the Bill fails to monetise any costs or benefits of the smart data elements. I think that the smart data concept is good but hope that we get it right.

Part 2 of the Bill deals with the digital verification services. Again, on the whole, I am supportive of this. The Bill should improve security of and trust in digital verification. As the noble Lord, Lord Arbuthnot, said, it is not about digital ID cards. However, a number of us raised a concern last time round. There is a danger that this could become a slippery slope towards a situation where people find themselves effectively compelled to use digital verification services, and therefore excluded from accessing services or products if they are not able or willing to use them. The “not willing” part of it is important. Some people are wary of putting detailed identity information online. I am increasingly wary, particularly as a resident of Dumfries and Galloway, where all medical records from NHS Dumfries & Galloway were recently hacked, held to ransom and probably published. Therefore, I have some sympathy with those who do not fully trust official systems. I am curious to hear what the Minister has to say in response to the comments from the noble Lord, Lord Markham, about increased cybersecurity in the public sector, as that is a good example of where it has gone wrong.

I know that there is no intention on the part of the Government at this time to make the use of DVS compulsory, but it is quite easy to see service providers, such as estate agents, financial institutions and, as one noble Lord mentioned, employers, making it a requirement. While I am supportive, I think we need some protections to ensure that people are not excluded from services as a result. I would be interested to hear the Minister’s thoughts.

On Part 5, the House of Lords Select Committee on the Fraud Act 2006 and Digital Fraud heard a number of times that banks and other financial institutions were unwilling to share data for fraud prevention purposes because they felt constrained by data protection rules. I suspect that they were wrong but am very pleased that data processing for the purposes of detecting, investigating or preventing crime is to be expressly included as a legitimate interest. I hope that the Information Commissioner will ensure that it is widely pointed out and that we will start to see greater co-operation between payment providers and the tech and telecoms companies where the vast bulk of frauds originate.

However, on the subject of the legitimate interest changes, I am concerned that the Secretary of State will be able to make changes to matters considered to be legitimate interests by regulation. That is a significant power in terms of data processing and potentially a retrograde step. It could also raise concerns with respect to the EU data adequacy points that I raised earlier. While the EU might be happy with what is currently proposed, the ability to change key aspects could raise alarm bells.

Other noble Lords have talked about automated decision-making, where I am also concerned about the weakening of rights. Currently, automated decision-making is broadly prohibited, with specific exceptions. This Bill would permit it in a wider set of circumstances, with fewer safeguards. In her introduction, the Minister seemed to indicate that the same safeguards would apply. As I understand it, that is the case only where special category data is used. I would be grateful if the Minister could explain whether I have got that wrong. It seems to me to increase the risk of unfair or opaque decisions. The noble Lord, Lord Arbuthnot, talked about the Horizon/Post Office scandal. That should certainly give us pause for thought. The computer does not always get it right. There are myriad examples of AI inventing false information and giving fake answers. It is called “hallucination”. The right to challenge solely automated decisions should be sacrosanct. Why have the Government decided to weaken those safeguards?

Finally, I am pleased to get on to a point that no one else has raised so far, which is an achievement. I note with relief that the abolition of the Biometrics and Surveillance Camera Commissioner has been removed. However, issues remain in these areas. In particular, the previous commissioner has described a lack of an overarching accountability framework around surveillance camera and biometrics usage. Can the Minister explain what the Government’s plans are for the regulation of surveillance camera and biometric use, especially facial recognition and especially as the use of AI expands into that area?

In summary, it is a much better Bill, but there is a lot of work to do.

My Lords, it is a privilege to follow the noble Lord, Lord Vaux. I agree with him completely that this is a better Bill. It is a real tribute to the Minister, and I thank her for how she introduced it.

I start with some good news. This Bill plugs a long-standing gap in our data provisions very handsomely—in providing data for researchers. It has been a real problem for civic society that we have had no reach into the affairs and behaviours of our tech companies, no true understanding of what the activities of their services are, how their algorithms are designed and how they behave, or even what kind of audiences they reach. The most basic questions cannot be answered because we do not have any legal reach or insight into what they are up to. That has been a long-standing inhibitor of accountability and, frankly, of good policy-making.

There were a number of attempts to bring this provision into legislation in the Online Safety Act and the previous data Bill. I am really pleased to see in Clause 123 the information for research about online safety matters provisions, which meet all the requests of those who were pressing for them. I pay tribute to the Minister for ensuring that this is in the Bill, as promised. I pay tribute to my noble friend Lord Camrose. He got these provisions into the previous Bill. Unfortunately, that Bill was pulled at the last minute, but I offer some respect to him too on that point.

This is such an important provision. It should not be overshadowed by the other important contents of this Bill. Can the Minister use the opportunity of the passing of this Bill to flesh out some of the provisions in this important clause? I would like to find a forum to understand how researchers can apply for data access; how the privacy protection measures will be applied; how the government consultation will be put together; and how the grounds for an application might work. There are opportunities for those who resist transparency to use any of these measures to throw a spanner in the works. We owe it to ourselves, having seen these provisions put into the Bill, to ensure that they work as well as intended.

That gap having been plugged, I would like to address two others that are still standing. First is the importance of preventing AI-trained algorithms from creating CSAM, which the noble Baroness, Lady Kidron, spoke movingly about. I pull that out of the many things that she mentioned because it is such a graphic example of how we really must draw a red line under some of the most egregious potential behaviours of AI. I am fully aware that we do not, as a country, want to throw obstacles in the way of progress and the very many good things that AI might bring us. There is a tension between the European and American approaches, on which we seek a position of balance. But if we cannot stop the AI from creating images and behaviours around CSAM, my goodness, what can we stop? Therefore, I ask the Minister: why can we not put such a provision on the face of the Bill? I will strongly support any efforts to do so.

Lastly, following the comments from the noble and learned Lord, Lord Thomas, on data processing, I flag the very important issue of transfers of data to areas where there is not any clear adequacy and, in fact, no legal system for implementing the rule of law necessary to stand up standard contractual clauses. Your Lordships will be aware that in countries such as China and Russia the rule of law is very lightly applied to matters of data. Protecting British citizens’ data, when it goes to such countries, should be the responsibility of any Government, but that is a very difficult thing to provide for. Huge amounts of data are now travelling across borders to countries where we really do not have any legal reach. The explosion of BYD cars in the UK is an example of the sheer quantity of data that is going overseas. Genomic information processed on Chinese genomic machines is an example of where some of that data is now more sensitive. It is a big gap in our data protection laws that we do not have a mechanism for fully accounting for the legal handling of that data. I brought in an amendment to the previous Bill, Amendment 111, which I urge the Minister to look at if she would like to understand this issue more carefully. I give fair warning that I will seek to move a version of that amendment to this Bill.

I, too, thank the Minister for her introduction to this welcome Bill. I feel that most noble Lords have an encyclopaedic knowledge of this subject, having been around the course not just once but several times. As a newcomer to this Bill, I am aware that I have plenty to learn from their experience. I would like to thank the Ada Lovelace Institute, the Public Law Project, Connected by Data and the Open Data Institute, among others, which have helped me get to grips with this complicated Bill.

Data is the oil of the 21st century. It is the commodity which drives our great tech companies and the material on which the large language models of AI are trained. We are seeing an exponential growth in the training and deployment of AI models. As many noble Lords have said, it has never been more important than now to protect personal data from being ruthlessly exploited by these companies, often without the approval of either the data owners or the creators. It is also important that, as we roll out algorithmic use of data, we ensure adequate protections for people’s data. I, too, hope this Bill will soon be followed by another regulating the development of AI.

I would like to draw noble Lords’ attention to a few areas of the Bill which cause me concern. During the debates over the last data protection Bill, I know there were worries over the weakening of data subjects’ protection and the loosening of processing of their data. The Government must be praised for losing many of these clauses, but I am concerned, like some other noble Lords, to ensure adequate safeguards for the new “recognised legitimate interests” power given to data processors. I support the Government’s growth agenda and understand that this power will create less friction for companies when using data for their businesses, but I hope that we will have time later in the passage of the Bill to scrutinise the exemption from the three tests for processing data, particularly the balancing test, which are so important in forcing companies to consider the data rights of individuals. This is especially so when safeguarding children and vulnerable people. The test must not be dropped at the cost of the rights of people whose data is being used.

This concern is reinforced by the ICO stating in its guidance that this test is valuable in ensuring companies do not use data in a way that data subjects would not reasonably expect it to be used. It would be useful in the Explanatory Notes to the Bill to state explicitly that when a data processor uses “recognised legitimate interests”, their assessment includes the consideration of proportionality of the processing activity. Does the Minister agree with this suggestion?

The list of four areas for this exemption has been carefully thought through, and I am glad that the category of democratic engagement has been removed. However, the clause does give future Ministers a Henry VIII power to extend the list. I am worried, and I have heard other noble Lords say that they are too; the clause’s inclusion in the previous Bill raised the same concern. It could allow future Ministers to succumb to commercial interests and add new categories, which might be to the cost of data subjects. The Minister, when debating this power in the previous data Bill, reminded the House that the Delegated Powers and Regulatory Reform Committee said of these changes:

“The grounds for lawful processing of personal data go to the heart of the data processing legislation and therefore in our view should not be capable of being changed by subordinate legislation”.

The Constitution Committee’s report called for the Secretary of State’s powers in this area to be subject to primary and not secondary legislation. Why do these concerns not apply to Clause 70 in this Bill?

I welcome the Government responding to the scientific community’s demand that they should be able to reuse data for scientific, historic or statistical research. There will be many occasions when data was collected for the study of a specific disease and the researchers want to reuse it years later for further study, but they have been restricted by the narrow distinctions between the original and the new purpose. The Government have incorporated recitals from the original GDPR in the Bill, but the changes in Clause 67 must be read against the developments taking place in AI and the way in which it is being deployed.

I understand that the Government have gone to great efforts to set out a clear definition of scientific research in this clause. One criterion is the

“processing for the purposes of technological development or demonstration … so far as those activities can reasonably be described as scientific”,

and another is the publication of scientific papers from the study. But my fear is that AI companies, in their urgent need to scrape datasets for training large language models, will go beyond the policy intention in this clause. They might posit that their endeavours are scientific and may even be supported by academic papers, but when this is combined with the inclusion of commercial activities in the Bill, it opens the way for data reuses in creating AI data-driven products which claim they are for scientific research. The line between product development and scientific research is blurred because of how little is understood about these emerging technologies. Maybe it would help if the Bill set out what areas of commercial activity should not be considered scientific research. Can the Minister share with the House how the clause will stop attempts by AI developers to claim they are doing scientific research when they are reusing data to increase model efficiency and capabilities, or studying their risks? They might even be producing scientific papers in the process.

I have attended a forum with scientists and policymakers from tech companies using the training data for AI who admitted that it is sometimes difficult to define the meaning of scientific research in this context. This concern is compounded by Clause 77, which gives researchers and archivists an exemption from the requirement in Article 13 of the UK GDPR to provide additional information to a data subject when reusing their data for different purposes, where doing so would require disproportionate effort. I understand these provisions are drawn to help reuse medical data, but they could also be used by AI developers to say that contacting people for the reuse of datasets from an already trained AI model requires disproportionate effort. I understand there are caveats around this exemption. However, in an era when AI companies are scraping millions of pieces of data to train their models, noble Lords need to bear in mind that it is often difficult for them to get permission from the data subjects before reusing the information for AI purposes.

I am impressed by the safeguards for the exemption for medical research set out in Clause 85. The clause says that medical research should be supervised by a research ethics committee to assess the ethical reuse of the data. Maybe the Government should think about using some kind of independent research committee with standards set by UKRI before commercial researchers are allowed to reuse data.

Like many other noble Lords, I am concerned about the changes to Article 22 of the UK GDPR put forward in Clause 80. I quite understand why the Government want to expand solely automated decision-making in order for decisions to be made quickly and efficiently. However, these changes need to be carefully scrutinised. The clause removes the burden on the data controller to overcome tests before implementing ADM, outside of the use of sensitive information. The new position requires the data subject to proactively ask if they would like a human to be involved in the decision made about them. Surely the original Article 22 was correct in making the processor think hard before making a decision to use ADM, rather than putting the burden on the data subject. That must be the right way round.

There are other examples, which do not include sensitive data, where ADM decisions have been problematic. Noble Lords will know that, during Covid, algorithms were used to predict A-level results which, in many cases, were flawed. None of that information would have been classified as sensitive, yet the decisions made were wrong in too many cases.

Once again, I am concerned about the Henry VIII powers which have been granted to the Secretary of State in new Article 22D(1) and (2). This clause is already extending the use of ADM, but it gives Secretaries of State in the future the power to change by regulation the definition of “meaningful human involvement”. This potentially allows for an expansion of the use of ADM; they could water down the effectiveness of human involvement needed to be considered meaningful.

Likewise, I am worried by the potential for regulations to be used to change the definition of a decision having a “significant adverse effect” on a data subject. The risk is that this could be used to exclude them from the relevant protection, but the decision could nevertheless still have a significant harmful effect on the individual. An example would be if the Secretary of State decided to exclude from the scope of a “significant decision” interim, rather than final, decisions. This could result in the exclusion of a decision taken entirely on the basis of a machine learning predictive tool, without human involvement, to suspend somebody’s universal credit pending an investigation and final decision of whether fraud had actually been committed. Surely some of the anxiety about this potential extension of ADMs would be assuaged by increased transparency around how they are used. The Bill is a chance for the Government to give greater transparency to how ADMs process our information. The result would be to greatly increase public trust.

The Algorithmic Transparency Recording Standard delivers greater understanding about the nature of tools being used in the public sector. However, of the 55 ADM tools in operation, only nine currently have reports published under the ATRS. In contrast, the Public Law Project’s Tracking Automated Government register has identified at least 55 additional tools, with many others still to be uncovered. I suggest that the Government make it mandatory for public bodies to publish information about the ADM systems that they are using on the ATRS hub.

Just as importantly, this is a chance for people to obtain personal information about how an automated decision is made. The result would be that, if somebody is subject to a decision made or supported by AI or an algorithmic tool, they should be notified at the time of the decision and provided with a personalised explanation of how and why it was reached.

Finally, I will look at the new digital verification services trust framework being set up in Part 2. The Government must be praised for setting up digital IDs, which will be so useful in the online world. My life, and I am sure that of many others, is plagued by the vagaries of getting access to the various websites we need to run our lives, and I include the secondary security on our phones, which so often does not work. The effectiveness of this ID will depend on the trust framework that is created and on who is involved in building it.

At the moment, in Clause 28, the Secretary of State must consult the Information Commissioner and such other persons as the Secretary of State sees appropriate. It seems to me that the DVS will be useful only if it can be used across national boundaries. Interoperability must be crucial in a digital world without frontiers. I suggest that an international standards body should be included in the Bill. The most obvious would be W3C, the World Wide Web Consortium, which is the standards body for web technology. It was founded by Sir Tim Berners-Lee and is already responsible for the development of a range of web standards, from HTML to CSS. More than that, it is used in the beta version of the UK digital identity and attributes trust framework and has played a role in both the EU and the Australian digital identity services frameworks. I know that the Government want the Secretary of State to have flexibility in drawing up this framework, but the inclusion of an international standards body in the Bill would ensure that the Minister has them in the forefront of their mind when drawing up this much-needed framework.

The Bill is a wonderful opportunity for our country to build public trust in data-driven businesses and their development. It is a huge improvement on its predecessor; it goes a long way to ensure that the law has protections for data subjects and sets out how companies can lawfully use and reuse data. It is just as crucial in the era of AI that, during the passage of the Bill through the House, we do not leave the door open for personal data to be ruthlessly exploited by the big tech companies. We would all be damaged if that was allowed to happen.

My Lords, when we considered the Data Protection and Digital Information Bill earlier in the year, I confessed to feeling somewhat out of my depth and at the edge of my comfort zone. It is with some trepidation that I enter this debate, in particular following the brilliant speeches by my noble friends Lord Knight and Lord Stevenson, and the noble Baroness, Lady Kidron, who lead in this field and understand the subject much better than I do.

Like others, I am delighted by the changes that the Bill brings forward, not least because, when we were in opposition, the noble Baroness, Lady Jones, and I attacked the DPDI Bill for being incoherent and lacking vision. It was simply a bundle of proposals that many people did not want. Although this data Bill is still a large piece of legislation, it is much narrower and the better for it. Firms and data subjects alike will more easily understand it. Therefore, I hope it keeps us much closer to the EU’s rules, as we approach its crucial review of the UK’s data adequacy decision.

The Minister correctly set this new Bill in the context of driving economic growth. The Bill rightly focuses on harnessing data for economic growth, supporting modern digital government, and improving or seeking to improve the lives of our citizens. The last Administration sought to dramatically water down the rights of data subjects, particularly around data subject access requests and high-risk processing. This Bill has dropped many of those proposals, leaving in place the requirement to have UK-based representatives and maintaining the duties of data protection officers, which include carrying out impact assessments. This should reassure individuals that their data will continue to be kept safe.

The last Bill contained some egregious measures—others have referred to the DWP measures that would have required banks and financial organisations to provide data about accounts linked to benefits claimants, including the state pension. That, thankfully, has gone, with Labour’s promise to introduce a separate Bill tackling fraud and error.

Gone too are the plans to give the Secretary of State a veto on codes of practice prepared by the Information Commissioner, which called into question the commissioner’s independence. Similarly, the Government have taken out plans to abolish the Biometrics and Surveillance Camera Commissioner, as the noble Lord, Lord Vaux, rightly mentioned. A number of colleagues expressed concern about the implications of this, when the use of such data and equipment was becoming more widespread, rather than less. Again, this Bill does not include these measures, leaving those officers and the requirements on them very much in place.

The last Government sought to change data rules on political parties, allowing them to use certain datasets for campaigning purposes. Former Ministers could not properly explain what this would mean in practice or where the request had come from, so removing these measures from the Bill is, again, welcome.

When introduced, the DPDI Bill did not contain measures on coroners’ access to data, despite that having been promised to the noble Baroness, Lady Kidron. As Opposition Front-Benchers, I and the noble Baroness, Lady Jones, made commitments to her that we would take this forward. This Bill delivers on that manifesto commitment. Similarly, we worked collaboratively with the noble Lord, Lord Bethell, during the passage of what became the Online Safety Act to promote access to data for researchers. The inclusion of such measures in this Bill again shows that the Labour Party is following through on its commitments, including those given at the time of the election.

Like other noble Lords, I have received a number of briefing papers, some of which raise highly interesting questions and points. The Law Society says it remains concerned about the UK’s ability to ensure that we meet EU data adequacy standards. I recall that we were concerned about this in opposition. It suggests that Clause 84 deregulates the transfer of data across borders and international organisations, so can the Minister reassure the House and me that this will not put the UK at risk during the 2025 assessment of data adequacy? In a similar vein, can she assure noble Lords that the newly formed information commission will maintain sufficient independence from government and entrench our EU-UK data adequacy in that respect?

Another issue raised during debates on the DPDI Bill was the need to ensure that there is meaningful human involvement in decisions where automated decision-making processes are in play. Other noble Lords have raised this again. Can we have an assurance that the ethical principles of safety, transparency, fairness and contestability will be properly in place?

One other area which I know the Minister is interested in and excited by is the potential of the legislation in relation to the national underground asset register. We probed this in opposition; are we satisfied in government that the move to an NUAR over the next year or so will take into account the existence of the private sector company LinesearchbeforeUdig and ensure that there is a smooth transition to a national network? Given the current impact on critical national infrastructure of £2.4 billion-worth of accidents in our various grids, we must make sure that we harvest the benefit of the new technology to protect that critical part of infrastructure and our national economy.

Finally, I raise the concerns expressed by the News Media Association about the unlicensed use of data created by the media and broader creative industries. It argues that this represents intellectual property theft by AI developers. The consequences of AI firms being free to scrape the web without remunerating creators can be profound. Its long-term fear is that this will reduce investment in trusted journalism. If less and less human-authored intellectual property is produced, tech developers may find ultimately that the high-quality data essential for generative AI is lacking. I realise that this Bill does not cover AI, but it is important that, if we are to drive growth and innovation using AI, we consider developing a dynamic licensing market by making the UK’s copyright regime enforceable. Can the Minister offer some insight into government thinking on that point?

This is a much more narrowly focused Bill, large though it is, and benefits from it. I think the hours we spent earlier in the year interrogating the DPDI Bill were well spent, because they paved the way for this more streamlined and pragmatic approach, which we welcome.

My Lords, it is a pleasure to take part in this Second Reading debate. I thank the Minister for the way she introduced the Bill. I declare my interests as set out in the register, particularly those in technology and financial services: as an adviser to Ecospend, an open banking technology, and to Socially Recruited, an AI business.

It is a pleasure to take part in a Second Reading for the third time on one Bill with three different names. We should all feel grateful that the only word to survive in all those titles is “data”, which must be a good thing. It is also a pleasure to follow so many excellent speeches, to which I find myself simply saying “Yes, agree, agree”, in particular the excellent speech of the noble Baroness, Lady Kidron, who pointed to some of the most extreme and urgent issues that we must address in data. I also support the concept from the noble Lord, Lord Knight, of the Government laying out their overall approach to all new technologies and issues around data so that we have a road map, suite, menu or whatever of everything they intend in the coming months and years, so that we can have clarity and try to enable consistency through all these Bills and statutory measures, which cover so much of our economy and society. As this is the third Second Reading for one Bill, I will cover three issues: smart data, automated decisions and the use of data in training AI.

On smart data, perhaps it would be better for the public if we called it “smart uses of data”. As has been mentioned, open banking is currently the only smart use of data. Perhaps one of the reasons why it has not been mainstreamed or got to a critical level in our society is the brand and rollout of open banking. We should all be rightly proud of the concept’s success—made in the UK and replicated in over 60 jurisdictions around the world, many of which have gone much further than us in a much shorter time. It demonstrates that we know how to do right-sized regulation and shows that we know how to regulate for innovation, consumer protection and citizens’ rights. Yet we are doing so little of this legislation and regulation.

It is one thing to pass that regulatory intervention; perhaps the Government and other entities did not do anywhere near enough promotion of the opportunities and possibilities of open banking. If you polled people on the high street about open banking, I imagine they would say, “I have no idea what you’re talking about; they’ve closed all the branches”. This goes to the heart of the point raised by the noble Lord, Lord Knight. Without a coherent narrative, explained, communicated and connected across our society, it is hardly surprising that we have not only this level of take-up of open banking but this level of connection to all the opportunities around these new technologies.

The opportunities are immense, as set out in this Bill. The extension of smart data into areas such as energy provision could be truly transformational for citizens and bill payers. What is the Government’s plan to communicate these opportunities on the passage of this Bill to make all bill payers, citizens and consumers aware of the opportunities that these smart data, smart energy and smart savings provisions may bring to them?

Secondly, as has rightly and understandably been mentioned by noble Lords, the Bill proposes a significant and material change to automated decision-making. It could be argued that one of the impacts of gen AI has been to cause a tidal wave of automated decisions, not least in recruitment and employment. Somebody may find themselves on the wrong end of a shortlisting decision for a role: an automated decision where the individual did not even know that AI was in the mix. I suggest that that makes as clear a case as any for the need to label all goods and products in which AI is involved.

The Bill seeks to take Article 22 and turn it into what we see in Clause 80. Would the Minister not agree that Clause 80 is largely saying, “It’s over to you, pal”? How can somebody effectively assert their right if they do not even know that AI and automated decision-making were in the mix at the time? Would the Minister not agree that, at the very least, there must be a right for an individual to have a personalised decision to understand what was at play, with some potential for redress if so sought?

Thirdly, on the use of data in training AI, where is the Bill on this most critical point? Our creatives add billions to the UK economy and they enrich our society. They lift our souls, making music where otherwise there may be silence, filling in the blank page with words that change our lives and pictures that elevate the human condition. Yet right now, we allow their works to be purloined without consent, respect or remuneration. What does the Bill do for our creative community, a section of the economy growing at twice the rate of the rest of it?

More broadly, why is the Bill silent when it comes to artificial intelligence, impacting as it does so many elements of our economy, society and individuals’ lives right now? If we are not doing AI in this Bill, when will we be? What are we waiting to know that we do not already know to make a decent effort at AI legislation and regulation?

The danger is that, with so much running through the Bill, if we do not engender a connection with the public then there will be no trust. No matter how much potential there is in these rich datasets and potential models to transform our health, education, mobility and so much more, none of it will come to anything if there is not public trust. I guess we should not be so surprised that, while we all enjoy “Wolf Hall: The Mirror and the Light” every Sunday evening, there is more than a degree of Henry VIII spattered through this Bill as a whole.

I move to some final questions. What is the Government’s position when it comes to the reversal of the burden of proof in computer evidence? We may need to modernise the pre-1999 position, but it should certainly be the case that such evidence is put to proof. We cannot continue with the situation so shamefully and shockingly exposed by the Horizon scandal, as rightly set out by my noble friend Lord Arbuthnot, who has done more than any in that area.

Similarly, on the Bill in its entirety, has the “I” of GenAI been passed over in the Bill as currently constructed? So many of the clauses and so much of the wording were put together before the arrival of GenAI. Is there not a sense that there is a need for renewal throughout the Bill, with so many clauses at least creaking as a consequence of the arrival of GenAI?

Will the Government consider updating the Computer Misuse Act, legislation which came into being before we had any of this modern AI or modern computing? Will they at least look at a statutory defence for our cyber community, who do so much to keep us all safe but, for want of a statutory defence, have to do so much of that with at least one hand tied behind their back?

Does the Minister believe that this Bill presents the opportunity to move forward with data literacy, which will be required if citizens are to assert their data rights and be able to say of their data, “It is my data and I decide to whom it goes, for what and for what remuneration”?

Finally, what is the Government’s approach to data connected to AI legislation, and when may we see at least a White Paper in that respect?

Data may be, as the Minister said, the DNA of our time, or, as other noble Lords have said, the oil; perhaps more pertinently it may be the plastic of our time, for all that that entails. The critical point is this: it offers so much potential, but not inevitability, to drive economic, social and psychological growth. We need to enable and empower all our citizens to be able to say, full-throatedly, “Our data; our decisions; our human-led digital futures”.

My Lords, I want to get on to the digital verification service. First, I declare that I am very interested in digital twins; there are huge advantages in modelling—for instance, the underground and all the various things that can choke traffic. I went to a very interesting event at Connected Places Catapult, where they are modelling all the influences on traffic and flows, et cetera, to try to work out how you can alleviate it and get emergency services through when everything is choked up. There is huge advantage in being able to model that, and for that we need data sharing and all the other things.

The other thing I got very interested and involved in, with FIDO and Kaimai, is causal AI. As people say, we need to know how it got there: what sources was it relying on when it reached certain things that it put in the reports or decisions made? It is a very important aspect, because the “I” in AI is not really the right word to use. A computer is not innately intelligent. It is like a child; what you put into it and what it learns from that could well be not what you expected it to learn at all. We have to be very careful of believing too much in it, putting too much faith in it and thinking that it will run the future beautifully.

Here is the bit that I am more interested in. I was talking to my noble friend Lady Kidron just before the debate, and she pointed out something to me because of my previous involvement in chairing the British Standard PAS 1296 on age verification, which we did around the Digital Economy Act when we were worried about all the implications and the pornography issues. The trouble now is that the Government seem to think that the only age verification that matters is checking that someone is over 18, so that they can purchase alcohol and knives or view pornography, which should not be available to youngsters. But when we wrote it, we were very careful to make sure that there was no age specified. For educational purposes, there is material that you want to go to particular age cohorts and do not want for children at other ages because it is wrong for their stage of development and knowledge. Also, you need to be able to check that older people are not trying to get into children’s social groups; they must be excludable from them. Age verification, whenever it is referred to, should work in any direction and at any age threshold you want. It should not be so inflexible.
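The point the noble Earl makes—that verification should work at any threshold and in any direction, not just “over 18”—can be sketched in a few lines. This is a purely illustrative sketch; the function names and thresholds are invented here and are not drawn from PAS 1296 or any certified scheme:

```python
from datetime import date

def age_in_years(dob: date, on: date) -> int:
    """Completed years between a date of birth and a reference date."""
    years = on.year - dob.year
    if (on.month, on.day) < (dob.month, dob.day):
        years -= 1  # birthday not yet reached this year
    return years

def verify_age(dob: date, on: date, minimum=None, maximum=None) -> bool:
    """An age check that works 'in any direction': set only `minimum` for an
    over-18 style check, only `maximum` to keep adults out of children's
    spaces, or both for an age cohort such as 11 to 14."""
    age = age_in_years(dob, on)
    if minimum is not None and age < minimum:
        return False
    if maximum is not None and age > maximum:
        return False
    return True

ref = date(2024, 11, 19)
print(verify_age(date(2000, 1, 1), ref, minimum=18))              # True: over-18 check
print(verify_age(date(1990, 3, 5), ref, maximum=13))              # False: adult excluded from a children's group
print(verify_age(date(2012, 6, 1), ref, minimum=11, maximum=14))  # True: 11-14 educational cohort
```

The same predicate serves purchase checks, child-protection exclusions and age-cohort content alike; a real verification provider would of course attest the underlying attribute rather than handle a raw date of birth.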

I was sent a briefing by the Association of Document Validation Professionals and the Age Verification Providers Association. I was very much there when all that started off, when I was chairman of EURIM, which became the Digital Policy Alliance. They represent some 50 attribute and identity providers, and they warmly welcome the Bill and the priority that the new Government are giving to returning it to Parliament. I will highlight the sections of the Bill dealing with digital verification services that they feel would merit further discussion during later stages.

In Clause 27, “Introductory”, ideally there would be a clear purpose statement that the Bill makes certified digital ID legally valid as a proof of identity. This has always been a problem. To progress in the digital world, we will need to ensure that digital verification of ID is given equal standing with physical passports and driving licences. These documents are technically only tokens to enable you to cross borders or drive a vehicle, but they are frequently used as proof of ID. This can be done digitally, and the benefit of digital ID is that it is much harder to forge and therefore much more reliable. For some reason, we have always had a fear of that in the past.

In Clause 29, on “supplementary codes”, they are very worried that it could add time and cost to developing these codes if the processes are owned by the Office for Digital Identities and Attributes—OfDIA. There should be a stronger obligation to co-create with the industry, both in preparing the initial rules and in any revisions. The drafting is, apparently, currently ambiguous about any requirements for consultation. I know that that has been a problem in the past. There will be specialist requirements particular to specific sectors and the OfDIA will not necessarily have the required expertise in-house. There are already organisations in place to do things around each of these codes.

In Clause 33, on registration on the digital verification services register, the changes to the previous Bill around registration processes are welcome and, most notably, the Government have recognised in the Bill the need for national security checks. The problem is that there is no independent appeals mechanism if the Secretary of State refuses to register a DVS or removes it from the register, short of judicial review—and that is both time-consuming and very expensive. Most would not be able to survive long enough to bring the case to a conclusion, so we need to think of other remedies, such as some form of appeals tribunal.

In Clause 39, on the fees for registration et cetera, the fees are a new tax on the industry and may go beyond raising sufficient funds for the costs of administering the scheme. They welcome fees now being subject to parliamentary scrutiny, but would like to see a statutory limit on raising more than is required to fund DVS governance. There are figures on it which I could give you, but I will not bore you with them right now.

In Clause 50, on trust marks for use by registered persons, there may be a benefit from more direct linking of the requirements relating to marks of conformity to the Trade Marks Act.

In Clause 51, on the powers of a Secretary of State to require information, this wide-ranging power to demand information may inherently override the Data Protection Act. It extinguishes any obligation of confidentiality owed by a conformity assessment body to its clients, such as the contents of an audit report. The net effect could be to open up audit reports to freedom of information requests: the relevant FoI exemption rests on confidentiality, which the Bill appears to override. The way the Bill is structured could also mean that the Secretary of State can override a court order imposing confidentiality. I do not think we should allow that.

Clause 52 is about arrangements for third parties to exercise functions. In its current form, the Office for Digital Identities and Attributes is an unusual regulator. It is not independent from the Government and does not share the features of other regulators. It may therefore not be able to participate in the Digital Regulation Cooperation Forum, for example, based on the powers relied upon by its members to collaborate with other regulators.

The OfDIA may also fall outside the regulatory duty, which applies to most regulators, to promote growth. It is unclear whether the new regulatory innovation office will have jurisdiction over the OfDIA. It would be helpful to explore whether a more conventional status as an independent regulator would be preferable.

I think that is enough complication for the moment.

I welcome the Bill and thank my noble friend Lady Jones of Whitchurch for her clear introduction. It represents a significant improvement on the Data Protection and Digital Information Bill that we had such fun discussing last year under the previous Government. I thank the noble Viscount, Lord Camrose, for his handling of the Bill at that stage and look forward to continuing these discussions.

However, there are some concerns on which it would be good to have some reassurance from the Government, so I welcome the opportunity to discuss potential improvements during the Bill’s passage through the House. It is also worth bearing in mind the remarks of the noble Lord, Lord Holmes of Richmond, that this is a fast-moving field and there is a continual danger of fighting the last war. I think that is certainly the case in relation to AI. So, time in Committee will have to be spent considering whether there is more that needs to be done because of the way the world has developed.

I am pleased that the Bill no longer covers information for social security purposes. I am not so pleased that it is going to reappear through the separate fraud, error and debt Bill. That is, of course, a discussion for another day; we have not seen it yet. My Government announced it two months ago and we have not yet seen it, so fingers crossed they are having second thoughts.

My prime concern with the Bill, and where I want to ensure that there are adequate safeguards, is individuals’ health data and what the provisions in the Bill mean for patients and for the public. It is notable that one of the key stated purposes of the Bill is to

“build an NHS fit for the future”,

which is of course one of the Government’s five missions.

My noble friend Lord Stevenson of Balmacara, who is not in his place, set out the issues very clearly. Nevertheless, I will repeat them, because I think that the point is so important. We have the problem that data regulation can slow down the pace of data sharing, increase risk aversion and make research and innovation more difficult. That is all true—it is true of all data, but particularly of health data. However, patients and the public rightly expect high standards for data protection, particularly when it comes to their health data, and I am worried that the effects of the Bill are not as strong as might be wished. This will need close examination during its passage through Committee. To get this wrong would damage public trust, negatively impact patient care, complicate the running of the health service and have a harmful effect on academic research and our life sciences industry. We must do our best to ensure that any concerns are misplaced—I hope that I am wrong.

Under current data protection laws there are transparency obligations, which means that information needs to be provided to the data subject that explains the use of their data. Reusing data for a different purpose is currently possible, but under limited circumstances—for example, reuse by the UK Health Security Agency. The main point of concern with the Bill, however, is with Clause 77, which, in the words of the BMA,

“will water down the transparency of information to patients”.

I suggest that we have to take the concerns of the BMA most seriously on this, which I am highlighting, but also on the other points it has made. What we have is a situation where data collected for one purpose can be reused for scientific research. In those circumstances, there is not necessarily a requirement to tell the data subjects about it. The definition of “scientific research” is very wide. It can be commercial or non-commercial. It can be funded publicly or privately. It also covers technological development, which is broadening the idea of scientific research.

Clearly, this is thought to be a good thing. It will remove barriers for valuable health research—timely availability of data is something important when you are undertaking research—and it is always possible that, during the course of the research, you can identify things which were not in the original proposal. All that is right, but there is a risk of data being reused for activities that data subjects might not have supported, have no control over and have no knowledge that it is happening. This feels like it contradicts the “no surprises” Caldicott principle. It is unclear to me at this stage who exactly is going to have oversight of all the data reuses to check that they are ethical and to check that the right standards are being applied.

The consequence is a real risk of the loss of patient and public trust in data use and sharing within the health sector and more widely. To reiterate, patients and the public rightly expect high standards of data processing to protect their confidential health data. I have serious concerns that the Bill, in its current state, runs the risk of diluting those standards and protections.

The underlying policy priority for the Bill, as I understand it, is to stimulate innovation through broadening the definition of “scientific research”. However, there is concern—for example, that expressed by the Ada Lovelace Institute—that, as currently written, the provisions in the Bill are susceptible to misuse. We must ensure that the Bill explicitly forbids the mass reuse of personal data scraped from the internet or acquired through social media for AI product development under the auspices of “scientific research”, a practice with the potential to provoke considerable public backlash. Voluntary commitments from the tech industry to protect people from the potential harms of AI models are welcome, of course, but are not good enough. Only hard rules enshrined in law can incentivise the developers and deployers of AI to comply, and empower the regulators to act.

Another unknown at this stage—I hope my noble friend can guide us here—is how far the Bill diverges from EU standards and potentially puts at risk the free flow of personal data between the EU and the UK. This free flow is critical to medical research and innovation and must be maintained.

I am also concerned about the issue of making data anonymous. It is incredibly difficult to make medical data anonymous. It is valueless in most cases if you do not know how old the subject is or their pre-existing conditions, and as soon as you have that sort of data it is open to manipulation. I believe that to counter those problems we need to expand the use of so-called trusted research environments. This is a well-developed technique in which Britain is the leader. I believe it should be a legal requirement in this field. The Bill does not go that far. It is certainly something we should discuss in Committee.

This is a system where the information—the subject’s data—is kept within a locked box. It stays within the box. The medical researchers, who are crucial, develop their program in a sandbox, and it is then applied to the locked-away data. The researchers would not get the data; they would just get the results of their inquiry. They do not go anywhere near the data. This level of protection is required to achieve public support. The outcome of the research in these circumstances is identical but the subjects’ medical information—crucially, but not only, their genetic information—is kept away and kept secure.
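The locked-box model the noble Lord describes can be illustrated with a toy sketch. Everything here—the class, the records and the crude disclosure check—is invented for illustration; real trusted research environments enforce this with accredited infrastructure and human checking of outputs before release:

```python
class TrustedResearchEnvironment:
    """Toy 'locked box': the data never leaves; researcher code comes in,
    and only non-record-level results go out."""

    def __init__(self, records):
        self._records = records  # the locked-away subject data

    def run(self, query):
        """Run a researcher-supplied program against the held data and
        release only its result, refusing record-level output."""
        result = query(iter(self._records))
        if isinstance(result, (list, dict, set, tuple)):  # crude disclosure control
            raise PermissionError("record-level output is not released")
        return result

tre = TrustedResearchEnvironment([
    {"age": 67, "condition": "diabetes"},
    {"age": 54, "condition": "asthma"},
    {"age": 71, "condition": "diabetes"},
])

# The researcher's program, developed in a sandbox, then applied to the data.
def mean_age_of_diabetics(rows):
    ages = [r["age"] for r in rows if r["condition"] == "diabetes"]
    return sum(ages) / len(ages)

print(tre.run(mean_age_of_diabetics))  # 69.0
```

The researcher receives the aggregate answer but never the records themselves, which is precisely the property that makes the approach compatible with public trust in health data.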

Finally, another point of concern that has been mentioned by a number of speakers is automated decision-making. The Bill removes the general prohibition on automated decision-making, placing responsibility on individuals to enforce their rights rather than on companies to demonstrate why automation is permissible. Even with the new safeguards being introduced, people will struggle to get meaningful explanations about decisions that will deeply affect their lives and will have difficulty exercising their right to appeal against automated decisions when the basis on which the decisions have been made is kept from them.

With those concerns, which I am sure we will discuss in Committee, I support the Bill.

My Lords, it is a great pleasure to follow the noble Lord, Lord Davies, and what he had to say on health data, much of which I agree entirely with. The public demand that we get this right and we really must endeavour to do all we can to reassure the public in this area.

I speak as someone deeply rooted in the visual arts and as an artist member of DACS—the Design and Artists Copyright Society. In declaring my interests, I also express gratitude for the helpful briefing provided by DACS.

The former Data Protection and Digital Information Bill returns to this House after its journey was interrupted by July’s general election. While this renewed Bill’s core provisions remain largely unchanged, the context in which we examine them has shifted significantly. The rapid advancements in artificial intelligence compel us to scrutinise this legislation not just for its immediate impact but for its long-term consequences. Our choices today will shape how effectively we safeguard the rights and interests of our citizens in an increasingly digital society. For this reason, the Bill demands meticulous and thorough examination to ensure that it establishes a robust governance framework capable of meeting present and future challenges.

Over the past year, Members of this House have carefully considered the opportunities and risks of large language models which power artificial intelligence applications—work that is still ongoing. I note that even today, the Lords Communications and Digital Committee, chaired by the noble Baroness, Lady Stowell of Beeston, is holding an evidence session on the role of AI in creative tech.

The committee’s previous inquiry into large language models stressed a need for cautious action. Drawing on expert testimony, its recommendations highlighted critical gaps in our current approach, particularly in addressing immediate risks in areas such as cybersecurity, counterterrorism, child protection, and disinformation. The committee rightly stressed the need for stronger assessments and guardrails to mitigate these harms, including in the area of data protection.

Regrettably, however, this Bill moves in the opposite direction, and instead seeks to lighten the regulatory governance of data processing and relaxes rules around automated decision-making, as other noble Lords have referred to. Such an approach risks leaving our legislative framework ill prepared to address the potential risks that our own committee has so carefully documented.

The creative industries, which contribute £126 billion annually to the UK economy, stand particularly exposed. Evidence submitted to the committee documented systematic unauthorised use of copyrighted works by large language models, which harvest content across the internet while circumventing established licensing frameworks and creator permissions.

This threat particularly impacts visual artists—photographers, illustrators, designers, et cetera—many of whom already earn far below the minimum wage, as others, including the noble Baroness, Lady Kidron, and the noble Lords, Lord Bassam and Lord Holmes, have already highlighted. These creators now confront a stark reality: AI systems can instantaneously generate derivative works that mimic their distinctive styles and techniques, all without attribution or compensation. This is not merely a theoretical concern; this technological displacement is actively eroding creative professionals’ livelihoods, with documented impacts on commission rates and licensing revenues.

Furthermore, the unauthorised use of reliable, trusted data, whether from reputable news outlets or authoritative individuals, fuels the spread of disinformation. These challenges require a solution that enables individuals and entities, such as news publishers, to meaningfully authorise and license their works for a fair fee.

This Bill not only fails to address these fundamental challenges but actively weakens existing protections. Most alarmingly, it removes vital transparency requirements for personal data, including data relating to individual creators, when used for research, archival and statistical purposes. Simultaneously, it broadens the definition of research to encompass “commercial” activities, effectively creating a loophole ripe for exploitation by profit-driven entities at the expense of individual privacy and creative rights.

Finally, a particularly troubling aspect of the Bill is its proposal to dissolve the Information Commissioner’s Office in favour of an information commission—a change that goes far beyond mere restructuring. Although I heard what the Minister said on this, by vesting the Secretary of State with sweeping powers to appoint key commission members, the Bill threatens to compromise the fundamental independence that has long characterised our data protection oversight. Such centralised political influence could severely undermine the commission’s ability to make impartial, evidence-based decisions, particularly when regulating AI companies with close government ties or addressing sensitive matters of national interest. This erosion of regulatory independence should concern us all.

In summary, the cumulative effect of this Bill’s provisions exposes a profound mismatch between the protections our society urgently needs and those this legislation would actually deliver. At a time when artificial intelligence poses unprecedented challenges to personal privacy and creative rights, this legislation, although positive on many fronts, appears worryingly inadequate.

My Lords, I declare an interest in that, through the Good Schools Guide, I am an extensive user of government schools data. With another hat on, I share my noble friend Lord Markham’s worries about how this affects little organisations with a bit of membership data.

I very much look forward to Committee, when we will get into the Bill’s substance. I supported almost everything that the noble Baroness, Lady Kidron, said and look forward to joining in on that. I also very much support what my noble friend Lord Holmes said, in particular about trust, so he will be glad to know that I have in advance consulted Copilot as to the changes they would like to see in the Bill. If I may summarise what they said—noble Lords will note that I have taken the trouble to ascertain their choice of pronouns—they would like to see enhanced privacy safeguards, better transparency and accountability, regular public consultation and reviews of the Act, impact assessments before implementation, support for smaller entities and clearer definition of key terms. I am delighted by how much I find myself in agreement with our future overlords.

To add to what the noble Earl, Lord Erroll, said about digital identity being better, there was a widespread demonstration of that during Covid, when right-to-work checks went digital. Fraud went down as a result.

On the substantial changes that I would like to see, like my noble friend Lord Arbuthnot of Edrom, I would like a clear focus on getting definitions of data right. It is really important that we have stability and precision in data. What has been going on in sex and gender in particular is ridiculous. Like many other noble Lords, I also want a focus on the use of artificial intelligence in hiring. It is so easy now to get AI support for making a job application that the number of job applications has risen hugely. In response to this, of course, AI has been used in assessing job applications, because you really cannot plough through 500 in order to make a shortlist. Like the Better Hiring Institute, which I am associated with, I would really like to see AI used to give people the reasons why they have not been successful. Give everybody a reply and engage everybody in this process, rather than just ignoring them—and I apologise to the many people who send me emails that I do not reply to, but perhaps I will do better with a bit of AI.

This is a very seasonal Christmas tree of a Bill and I shall not be shy of hanging baubles on it when we come to Committee, in the way that many other noble Lords have done. My choices include trying to make it possible for the Student Loans Company to be more adventurous in the use of its data. It ought to be a really good way of finding out how successful our university system is. It is in touch with university graduates in a way that no other organisation is, but it feels constrained in the sorts of questions it might ask. I would really like Action Fraud to record all attempts at fraud, not just the successful frauds. We need a better picture of what is going on there. I would like to see another attempt to persuade the DfE that schools admissions data should be centrally gathered. At the moment it is really hard for parents to use, which means there is a huge advantage for parents who are savvy and have the time. That is not the way it should be. Everybody should have good, intelligent access to understanding what schools are open to them. There will be plenty of opportunities in Committee, which, as I say, I look forward to.

In the context of data and House of Lords reform, when I did a snap census at 5.47 pm, the Cross-Bench Peers were in the majority in the House. That suggests that, in providing Peers who have a real interest in the core business of this House—revising legislation—the process of choosing Cross-Bench Peers does rather better than the process of choosing the rest of us. If we are to reform the House of Lords, getting that aspect into the political selection would be no bad thing. I would also like some data, in the sense of some clear research, on the value of Statement repeats. I cannot recall an occasion when a Statement repeat resulted in any change of government policy of any description. Perhaps other noble Lords can enlighten me.

My Lords, I thank the noble Lord, Lord Lucas, for injecting a bit of reality into this discussion. I declare my interest as a governor of Coram. I thank the noble Lord very much for his comments about the Cross Benches. Perhaps if we let HOLAC choose the Cross Benches and Copilot the political appointees, we might be slightly more successful than we have been in recent years.

I am conscious that I am the only person between your Lordships and the joys of the Front-Bench spokespeople, so I shall not take too long. I welcome the Bill, but I have some concerns. In particular, in reading through the introduction on page 1 line by line, I counted 12 instances of the Bill being “to make provision” and not a single specific mention of protection, which I feel is perhaps a slight imbalance.

I have six areas of concern that I suspect I and many others will want to explore in Committee. I am not going to dive into the detail at this stage, because I do not think it is appropriate.

Like many noble Lords, including the noble Lords, Lord Knight, Lord Vaux and Lord Bassam, and the noble and learned Lord, Lord Thomas, I have some concerns about the extension of the UK’s data adequacy status beyond June next year. Given that one of the key objectives of this Bill is to help economic growth, it is incredibly important that that happens smoothly. It is my hope and expectation that His Majesty’s new Government will find it slightly less painful and more straightforward to talk to some of our colleagues across the water in the EU to try to understand what each side is thinking and to ease the way of making that happen.

Secondly, like many noble Lords, I have a lot of concern about what is rather inelegantly known as “web scraping”—a term that sounds to me rather like an unpleasant skin rash—of intellectual property and its sheer disregard for IP rights, so undermining the core elements of copyright and the value of unique creative endeavour. It is deeply distasteful and very harmful.

One area that I hope we will move towards in Committee is the different departments of His Majesty’s Government that have an interest in different parts of this Bill consciously working together. In the case of web scraping, I think the engagement of Sir Chris Bryant in his dual role as Minister of State for Data Protection and Minister for Creative Industries will be very important. I hope and expect that the Minister and her colleagues will be able to communicate as directly as possible and have a sort of interplay between us in Committee and that department to make sure that we are getting the answers that we need. It is frankly unfair on the Minister, given that she is covering the ground of I-do-not-know-how-many Ministers down the other end, for her also to take on board the interests, concerns and views of other departments, so I hope that we can find a way of managing that in an integrated way.

Thirdly, like my noble friend Lady Kidron, I am appalled by AI-produced child sexual abuse material. To that extent, I make a direct request to the Minister and the Bill team that they read an article published on 18 October in the North American Atlantic magazine by Caroline Mimbs Nyce, “The Age of AI Child Abuse Is Here”. She writes about what is happening in the USA, but it is unpleasantly prescient. She writes,

“child-safety advocates have warned repeatedly that generative AI is now being widely used to create sexually abusive imagery of real children”—

not AI avatars—

“a problem that has surfaced in schools across the country”.

It is certainly already happening here. It will be accelerating. Some of your Lordships may have read about a scandal that has emerged in South Korea. My daughter-in-law is South Korean. AI-created adult sexual material has caused major trauma and a major decline in female mental health. In addition, when it comes to schools, there are real concerns about the ability of companies to scoop up photographic information about children from photos that schools have on their own websites or Facebook pages. Those images can then potentially be used for a variety of very unpleasant reasons, so I think that is an area which we would want to look at very carefully.

Fourthly, there are real concerns about the pervasive spread of educational technology—edtech, as it is known informally—driven, understandably, by commercial rather than educational ambition in many cases. We need to ensure that the age-appropriate design code applies to edtech and that is something we should explore. We need to prioritise the creation of a code of practice for edtech. We know of many instances where children’s data has been collected in situations where the educational establishments themselves, although they are charged with safeguarding, are wholly inadequate in trying to do it, partly because they do not really understand it and partly because they do not necessarily have the expertise to do it. It is unacceptable that children in school, a place that should be a place of safety, are inadvertently exposed to potential harm because schools do not have the power, resources and knowledge to protect the children for whom they are responsible. We need to think carefully about what we need to do to enhance their ability to do that.

On my fifth concern, the noble Lords, Lord Stevenson and Lord Holmes, made a very good point, in part about the Bletchley declaration. It would be helpful for us as a country and certainly as Houses of Parliament to have some idea of where the Government think we are going. I understand that the new Government are relatively recently into their first term and are somewhat cautious about saying too much about areas that they might subsequently regret, but I think there is a real appetite for a declaratory vision with a bit of flesh on it. We all understand that it might need to change as AI, in particular, threatens to overtake it, but having a stab at saying where we are, what we are doing, why we are doing it, the direction of travel and what we are going to do to modify it as we go along, because we are going to have to because of AI, would be helpful and, frankly, reassuring.

Lastly, during the passage of the Online Safety Bill, many of us tried to make the case for a Joint Committee to oversee digital regulation and the regulators themselves. I think it would be fair to say that the experience of those of us who were particularly closely involved with what is now the Online Safety Act and the interactions that we have had formally or informally with the regulator since then, and the frustrations that have emerged from those interactions, have demonstrated the value of having a joint statutory committee with all the powers that it would have to oversee and, frankly, to call people to account. It would really concentrate minds and make the implementation of that Act, and potentially this Act, more streamlined, more rapid and more effective. It could be fine-tuned thereafter much more effectively, in particular if we are passing a Bill that I and my fellow members of the Secondary Legislation Scrutiny Committee will have the joy of looking at in the form of statutory instruments. Apart from anything else, having a Joint Committee keep a close watch on the flow of statutory instruments would be enormously helpful.

As we are dealing with areas which are in departments that are not immediately within the remit of the Minister, such as the Department for Education given what I was talking about with schools, anything we can do to make it clear that the left hand knows what the right hand is doing would be extraordinarily helpful. I think there have been precedents in particular cases in Committee when we are dealing with quite detailed amendments for colleagues from other departments to sit on the Bench alongside the Minister to provide real departmental input. That might be a course that we could fruitfully follow, and I would certainly support it.

My Lords, I draw attention to my AI interests in the register. I thank the Minister for her upbeat introduction to the Bill and all her engagement to date on its contents. It has been a real pleasure listening to so many expert speeches this afternoon. The noble Lord, Lord Bassam, did not quite use the phrase “practice makes perfect”, because, after all, this is the third shot at a data protection Bill over the past few years, but I was really taken by the vision and breadth of so many speeches today. I think we all agree that this Bill is definitely better than its two predecessors, but of course most noble Lords went on to say “but”, and that is exactly my position.

Throughout, we have been reminded of the growing importance of data in the context of AI adoption, particularly in the private and public sectors. I think many of us regret that “protection” is not included in the Bill title, but that should go hand in hand if not with actual AI regulation then at least with an understanding of where we are heading on AI regulation.

Like others, I welcome that the Bill omits many of the proposals from the unlamented Data Protection and Digital Information Bill, which in our view— I expect to see a vigorous shake of the head from the noble Viscount, Lord Camrose—watered down data subject rights. The noble Lord, Lord Bassam, did us a great favour by setting out the list of many of the items that were missing from that Bill.

I welcome the retention of some elements in this Bill, such as the digital registration of birth and deaths. As the noble Lord, Lord Knight, said, and as Marie Curie has asked, will the Government undertake a review of the Tell Us Once service to ensure that it covers all government departments across the UK and is extended to more service providers?

I also welcome some of the new elements, in particular amendments to the Online Safety Act—essentially unfinished business, as far back as our Joint Committee. It was notable that the noble Lord, Lord Bethell, welcomed the paving provisions regarding independent researchers’ access to social media and search services, but there are questions even around the width of that provision. Will this cover research regarding non-criminal misinformation on internet platforms? What protection will researchers conducting public interest research actually receive?

Then there is something that the noble Baroness, Lady Kidron, Ian Russell and many other campaigners have fought for: access for coroners to the data of young children who have passed away. I think that will be a milestone.

The Bill may need further amendment. On these Benches we may well put forward further changes for added child protection, given the current debate over the definition of category 1 services.

There are some regrettable omissions from the previous Bill, such as those extending the soft opt-in that has always existed for commercial organisations to non-commercial organisations, including charities. As we have heard, there are a considerable number of unwelcome retained provisions.

Many noble Lords referred to “recognised legitimate interests”. The Bill introduces to Article 6 of the GDPR a new ground of recognised legitimate interest, which counts as a lawful basis for processing if it meets any of the descriptions in the new Annex 1 to the GDPR in Schedule 4 of the Bill. The Bill essentially qualifies the public interest test under Article 6(1)(e) of the GDPR and, as the noble Lord, Lord Vaux, pointed out, gives the Secretary of State powers to define additional recognised legitimate interests beyond those in the annex. This was queried by the Constitution Committee, and we shall certainly be kicking the tyres on that during Committee. Crucially, there is no requirement for the controller to make any balancing test, as the noble Viscount, Lord Colville, mentioned, taking the data subject’s interests into account. It just needs to meet the grounds in the annex. These provisions diminish data protection and represent a threat to data adequacy, and should be dropped.

Almost every noble Lord raised the changes to Article 22 and automated decision-making. With the exception of sub-paragraph (d), to be inserted by Clause 80, the provisions are very similar to those of the old Clause 14 of the DPDI Bill in limiting the right not to be subject to automated decision-making processing or profiling to special category data. Where automated decision-making is currently broadly prohibited with specific exceptions, the Bill will permit it in all but a limited set of circumstances. The Secretary of State is given the power to redefine what ADM actually is. Again, the noble Viscount, Lord Colville, was right in how he described what the outcome of that will be. Given the Government’s digital transformation agenda in the public sector and the increasing use of AI in the private sector, this means increasing the risk of biased and discriminatory outcomes in ADM systems.

Systems such as HART, which predicted reoffending risk, PredPol, which was used to allocate policing resources based on postcodes, and the gangs matrix, which harvests intelligence, have all been shown to have had discriminatory effects. It was a pleasure to hear what the noble Lord, Lord Arbuthnot, had to say. Have the Government learned nothing from the Horizon scandal? As he said, we need to move urgently to change the burden of proof for computer evidence. What the noble Earl, Lord Erroll, said, in reminding us of the childlike learning abilities of AI, was extremely important in that respect. We should not put our trust in that way in the evidence given by these models.

ADM safeguards are critical to public trust in AI, and our citizens need greater not less protection. As the Ada Lovelace Institute says, the safeguards around automated decision-making, which exist only in data protection law, are more critical than ever in ensuring that people understand when a significant decision about them is being automated, why that decision has been made, and the routes to challenge it or ask for it to be decided by a human. The noble Viscount, Lord Colville, and the noble Lord, Lord Holmes, set out that prescription, and I entirely agree with them.

This is a crucial element of the Bill but I will not spend too much time on it because, noble Lords will be very pleased to hear, I have a Private Member’s Bill on this subject, providing much-needed additional safeguards for ADM in the public sector, coming up on 13 December. I hope noble Lords will be there and that the Government will see the sense of it in the meantime.

We have heard a great deal about research. Clause 68 widens research access to data. There is a legitimate government desire to ensure that valuable research does not have to be discarded because of a lack of clarity around reuse or because of very narrow distinctions between the original and new purpose. However, it is quite clear that the definition of scientific research introduced by the Bill is too broad and risks abuse by commercial interests. A number of noble Lords raised that, and I entirely agree with the noble Baroness, Lady Kidron, that the Bill opens the door to data reuse and mass data scraping by any data-driven product development under the auspices of scientific research. Subjects cannot make use of their data rights if they do not even know that their data is being processed.

On overseas transfers, I was very grateful to hear what the noble and learned Lord, Lord Thomas, had to say about data adequacy, and the noble Lords, Lord Bethell, Lord Vaux and Lord Russell, also raised this. All of us are concerned about the future of data adequacy, particularly the tensions that are going to be created with the new Administration in the US if there are very different bases for dealing with data transfer between countries.

We have concerns about the national security provisions. I will not go into those in great detail, but why do the Government believe that these clauses are necessary to safeguard national security?

Many noble Lords raised the question of digital verification services. It was very interesting to hear what the noble Earl, Lord Erroll, had to say, given his long-standing interest in this area. We broadly support the provisions, but the Constitution Committee followed the DPRRC in criticising the lack of parliamentary scrutiny of the framework to be set by the Secretary of State or managed by DSIT. How will they interoperate with the digital identity verification services being offered by DSIT within the Government’s One Login programme?

Will the new regulator be independent, ensure effective governance and accountability, monitor compliance, investigate malicious actors and take enforcement action regarding these services? For high levels of trust in digital ID services, we need high-quality governance. As the noble Lord, Lord Vaux, said, we need to be clear about the status of physical ID alongside that. Why is there still no digital identity offence? I entirely agreed with what the noble Lords, Lord Lucas and Lord Arbuthnot, said about the need for factual clarity underlying the documents that will be part of the wallet—so to speak—in terms of digital ID services. It is vital that we distinguish and make sure that both sex and gender are recorded in our key documents.

There are other areas about which we on these Benches have concerns, although I have no time to go through them in great detail. We support the provisions on open banking, which we want to see used and the opportunities properly exploited. However, as the noble Lord, Lord Holmes, said, we need a proper narrative that sells the virtues of open banking. We are concerned that the current design allows landlords to be given access to monitoring the bank accounts of tenants for as long as an open banking approval lasts. Smart data legislation should mandate that the maximum and default access duration be no longer than 24 hours.

A formidable number of noble Lords spoke about web trawling by AI developers to train their models. It is vital that copyright owners have meaningful control over their content, and that there is a duty of transparency and penalties for scraping news publisher and other copyrighted content.

The noble and learned Lord, Lord Thomas, very helpfully spoke about the Government’s ECHR memorandum. I do not need to repeat what he said, but clearly, this could lead to a significant gap, given that the Retained EU Law (Revocation and Reform) Act 2023 has not been altered and is not altered by this Bill.

There are many other aspects to this. The claims for this Bill and these provisions are as extravagant as for the old one; I think the noble Baroness mentioned the figure of £10 billion at the outset. We are in favour of growth and innovation, but how will this Bill also ensure that fundamental rights for the citizen will be enhanced in an increasingly AI-driven world?

We need to build public trust, as the noble Lord, Lord Holmes, and the noble Baroness, Lady Kidron, said, in data sharing and access. To achieve the ambitions of the Sudlow review, there are lessons that need to be learned by the Department of Health and the NHS. We need to deal with edtech, as has been described by a number of noble Lords. All in all, the Government are still not diverging enough from the approach of their predecessor in their enthusiasm for the sharing and use of data across the public and private sectors without the necessary safeguards. We still have major reservations, which I hope the Government will respond to. I look forward—I think—to Grand Committee.

My Lords, let me start by repeating the thanks others have offered to the Minister for her ongoing engagement and openness, and to the Bill team for their—I hope ongoing—helpfulness.

Accessing and using data safely is a deeply technical legislative subject. It is, perhaps mysteriously, of interest to few but important to more or less everyone. Before I get started, I will review some of the themes we have been hearing about. Given the hour, I will not go into great detail about most of them, but I think it is worth playing some of them back.

The first thing that grabbed me, which a number of noble Lords brought up, was the concept of data as an asset. I believe the Minister used the phrase “data as DNA”, and that is exactly the right metaphor. Whether data is a sovereign asset or on the balance sheet of a private organisation, that is an incredibly important and helpful way to see it. A number of noble Lords brought this up, including the noble Baroness, Lady Kidron, and the noble Lords, Lord Knight and Lord Stevenson of Balmacara.

I was pleased that my noble friend Lord Lucas brought up the use of AI in hiring, if only because I have a particular bee in my bonnet about this. I have taken to writing far too many grumpy letters to the Financial Times about it. I look forward to engaging with him and others on that.

I was pleased to hear a number of noble Lords raise the issue of the burdens on small business and making sure that those burdens, in support of the crucial goal of protecting privacy, do not become disproportionate relative to the ability of small businesses to execute against them. The noble and learned Lord, Lord Thomas, the noble Lords, Lord Stevenson of Balmacara and Lord Bassam, and my noble friend Lord Markham brought that up very powerfully.

I have cheated by making an enormous group of themes, including ADM, AI and text and data mining—and then I have added Horizon on at the end. It is thematically perhaps a little ambitious, but we are getting into incredibly important areas for the well-being and prosperity of so many people. A great many noble Lords got into this very persuasively and compellingly, and I look forward to a great deal of discussion of those items as we go into Committee.

Needless to say, the importance of adequacy came up, particularly from the noble Lords, Lord Vaux and Lord Bassam, and the noble and learned Lord, Lord Thomas. There is a key question here: have we reduced the risk of loss of adequacy to as close to zero as we can reasonably get, while recognising that it is a decision that is essentially out of our sovereign hands?

A number of noble Lords brought up the very tricky matter of the definition of scientific research—among them the noble Viscount, Lord Colville, my noble friend Lord Bethell and the noble Lords, Lord Davies of Brixton and Lord Freyberg. This is a significant challenge to the effectiveness of the legislation. We all know what we are trying to achieve, but the skill and the art of writing it down is a considerable challenge.

My final theme, just because I so enjoyed the way in which it was expressed by the noble Lord, Lord Knight, is the rediscovery of the joys of a White Paper. That is such an important point—to have the sense of an overall strategy around data and technology as well as around the various Bills that came through in the previous Parliament and will, of course, continue to come now, as these technologies develop so rapidly.

My noble friend Lord Markham started by saying that we on these Benches absolutely welcome the Government’s choice to move forward with so many of the provisions originally set out in the previous Government’s DPDI Bill. That Bill was built around substantial consultation and approved by a range of stakeholders. We are particularly pleased to see the following provisions carried forward. One is the introduction of a national underground asset register. As many others have said, it will not only make construction and repairs more efficient but make them safer for construction workers. Another is giving Ofcom the ability, when notified by the coroner, to demand that online service providers retain data in the event of any child death. I notice the noble Baroness, Lady Kidron, nodding at that—and I am delighted that it remains.

On reforming and modernising the ICO, I absolutely take the point raised by some that this is an area that will take quite considerable questioning and investigation, but overall the thrust of the purpose of modernising that function is critical to the success of the Bill. We absolutely welcome the introduction of a centralised digital ID verification framework, recognising noble Lords’ concerns about it, of course, and allowing law enforcement bodies to make greater use of biometric data for counterterrorism purposes.

That said, there are provisions that were in the old DPDI Bill whose removal we regret, many of which we felt would have improved data protection and productivity by offering SMEs in particular greater agency to deal with non-high-risk data in less cumbersome ways while still retaining the highest protections for high-risk data. I very much welcome the views so well expressed by the noble and learned Lord, Lord Thomas of Cwmgiedd, on this matter. As my noble friend Lord Markham put it, this is about being wisely careful but not necessarily hyper-careful in every case. That is at least a way of expressing the necessary balance.

I regret, for example—the noble Lord, Lord Clement-Jones, possibly regrets this less than I do—that the Government have chosen to drop the “vexatious and excessive” standard for subject access requests to refer to “manifestly unfounded or excessive”. The term “vexatious” emerged from extensive consultation and would, among other things, have prevented the use of SARs to circumvent courts’ discovery processes. I am concerned that, by dropping this definition, the Government have missed an opportunity to prevent misuse of the deeply important subject access rights. I hope very much to hear from the Minister how the Government propose to address such practices.

In principle, we do not approve of the Government giving themselves the power to gain greater knowledge of citizens’ activities. Indeed, the Constitution Committee has made it clear that any legislation dealing with data protection must carefully balance the use of personal data by the state for the provision of services and for national security purposes against the right to a private life and freedom of expression. We on these Benches feel that, on the whole, the DPDI Bill maintained the right balance between those two opposing legislative forces. However, we worry that the DUA Bill, if used in conjunction with other powers that have been promised in the fraud, error and debt Bill, would tip too far in favour of government overreach.

Part 1 of the Bill, on customer and business data, contains many regulation-making powers. The noble Viscount, Lord Colville, my noble friend Lord Holmes and the noble Lord, Lord Russell, spoke powerfully about this, and I would like to express three concerns. First, the actual regulations affecting vast quantities of business and personal data are not specified in the Bill; they will be implemented through secondary legislation. Will the Minister give us some more information, when she stands up, about what these regulations may contain? This concern also extends to Part 2, on digital verification services, where in Clause 28,

“The Secretary of State must prepare and publish … rules concerning the provision of digital verification services”.

The Select Committee on the Constitution has suggested that this power should be subject to parliamentary scrutiny. I must say that I am minded to agree.

Secondly, throughout Part 1, regulation-making powers are delegated to both the Secretary of State and the Treasury. This raises several questions. Can the Secretary of State and the Treasury make regulations independently of one another? In the event of a disagreement between these government departments, who has the final say, and what are the mechanisms should they disagree? We would welcome some commentary and explanation from the Minister.

Thirdly, as the Select Committee on the Constitution has rightly pointed out, Clause 133 contains a Henry VIII power. It allows the Secretary of State, by regulations, to make consequential amendments to the provisions made by this Bill. This allows amendments to any

“enactment passed or made before the end of the Session in which this Act is passed”.

Why is this necessary?

The Bill introduces some exciting new terminology, namely “data holder” and data “trader”. Will the Minister tell the House what these terms mean and why they need to coexist alongside the existing terminology of “data processor” and “data controller”? I certainly feel that data legislation is quite complex enough without adding overlapping new terminology if we do not really need it.

I stress once again the concerns rightly raised by my noble friend Lord Markham about NUAR security. Are the Government satisfied that the operational protection of NUAR is sufficient to protect this valuable information from terrorist and criminal threats? More generally, additional cybersecurity measures must be implemented to protect personal data during this mass digitisation push. Will the Minister tell the House how these necessary security measures will be brought forward?

Finally, as I am sure all noble Lords will recall, the previous Government published a White Paper that set out five principles for AI. As a reminder, those were: safety, security and robustness; appropriate transparency and explainability; fairness; accountability and governance; and contestability and redress. I am minded to table an amendment to Clause 80, requiring those using AI in their automated decision-making process to have due regard for these five principles. I noted with interest that the noble Lord, Lord Stevenson of Balmacara, proposed something very similar but using the Bletchley principles. I am very keen to explore that further, on the grounds that it might be an interesting way of having principles-driven AI inserted into this critical Bill.

In conclusion, we on these Benches are broadly supportive of the Bill. We do, as I have set out, have a few concerns, which I hope the Minister will be willing to listen to.

My Lords, I thank all noble Lords for what has genuinely been a fascinating, very insightful debate. Even though I was part, I think, of my noble friend Lord Stevenson’s gang that has been working on this for some time, one learns new things, and I have learned new things again today about some of the issues that are challenging us. So I thank noble Lords for their contributions this evening, and I am very pleased to hear that a number of noble Lords have welcomed the Government’s main approach to the Bill, though of course beyond that there are areas where our concerns will diverge and, I am sure, be subject to further debate. I will try to clarify the Government’s thinking. I am sure noble Lords will understand, because we have had a very wide-ranging discussion, that if I am not able to cover all points, I will follow those up in writing.

I shall start with smart data. As was raised by my noble friend Lord Knight of Weymouth, and other noble Lords, the Government are keen to establish a smart data economy that brings benefits to consumers across all sectors.

Through the Smart Data Council, the Government are working closely to identify areas where smart data schemes might be able to bring more benefits. I think the point was made that we are perhaps not using it sufficiently at the moment. The Government intend to communicate where and in what ways smart data schemes can support innovation and growth and empower customers across a spectrum of markets—so there is more work to be done on that, for sure. These areas include providing the legislative basis for the fuel finder service announced by the Department for Energy Security and Net Zero, and supporting an upcoming call for evidence on the smart data scheme for the energy sector. Last week, the Government set out their priorities for the future of open banking in the national payments vision, which will pave the way for the UK to lead in open finance.

I turn now to digital identity, as raised by the noble Earl, Lord Erroll, and a number of other noble Lords. The measures in the Bill aim to help people and businesses across Britain to use innovative digital identity technologies and to realise their benefits with confidence. As the noble Lord, Lord Arbuthnot, said, the Bill does not make digital identities mandatory. The Bill will create a legislative structure of standards, governance and oversight for digital verification services that wish to appear on a government register, so that people will know what a good digital identity looks like. It is worth saying that a lot of these digital verification schemes already exist; we are trying to make sure that they are properly registered and have oversight. People need to know what a good digital identity looks like.

The noble Lord, Lord Arbuthnot, raised points about Sex Matters. Digital verification services can be used to prove sex or gender in the same way that individuals can already prove their sex using their passport, for example. Regarding the concerns of the noble Lord, Lord Vaux, about the inclusion of non-digital identity, the Government are clear that people who do not want to use digital identity or the digital verification services can continue to access services and live their daily lives referring to paper documents when they need to. Where people want to use more technology and feel left behind, DSIT is working hard to co-ordinate government work on digital inclusion. This is a high priority for the Government, and we hope to come back with further information on that very soon.

The Office for Digital Identities and Attributes has today published its first digital identity inclusion monitoring report. The results show a broadly positive picture of inclusion at this early stage of the markets, and its findings will inform future policy interventions.

I would like to reassure the noble Lord, Lord Markham, and the noble Viscount, Lord Camrose, that NUAR takes advantage of the latest technologies to ensure that data is accessed only for approved purposes, with all access audited. It also includes controls, developed in collaboration with the National Protective Security Authority, the National Cyber Security Centre and the security teams of asset owners themselves.

We had a very wide-ranging debate on data protection issues, and I thank noble Lords for their support for our changes to this legislation. The noble Viscount, Lord Camrose, and others mentioned delegated powers. The Government have carefully considered each delegated power and the associated parliamentary procedure and believe that each is proportionate. The detail of our rationale is set out in our delegated powers memorandum.

Regarding the concerns of the noble Lord, Lord Markham, and the noble Viscount, Lord Camrose, about the effect of the legislation on SMEs, we believe that small businesses would have struggled with the lack of clarity in the term “high-risk processing activities” in the previous Bill, which could have created more burdens for SMEs. We would prefer to focus on how small businesses can be supported to comply with the current legislation, including through user-friendly guidance on the ICO’s small business portal.

Many noble Lords, including the noble Viscount, Lord Camrose, the noble and learned Lord, Lord Thomas, and the noble Lord, Lord Vaux, raised EU adequacy. The UK Government recognise the importance of retaining our personal data adequacy decisions from the EU. I reassure the noble Lord, Lord Vaux, and my noble friend Lord Bassam that Ministers are already engaging with the European Commission, and officials will actively support the EU’s review process in advance of the renewal deadline next year. The free flow of personal data between the UK and the EU is one of the underpinning actions that enables research and innovation, supports the improvement of public services and keeps people safe. I join the noble Lord, Lord Vaux, in thanking the European Affairs Committee for its work on the matter. I can reassure him and the committee that the Secretary of State will respond within the required timeframe.

The noble Lord, Lord Bethell, and others raised international data transfers. Controllers and processors must take reasonable and proportionate steps to satisfy themselves that, after the international transfer, the level of protection for the data subject will be “not materially lower” than under UK data protection law. The Government take seriously their responsibility to ensure that data and its supporting infrastructure are secure and resilient.

On the question from the noble Viscount, Lord Colville, about the new recognised legitimate interest lawful ground, the entire point of the new lawful ground is to provide more legal certainty for data controllers that they are permitted to process personal data for the activities mentioned in new Annexe 1 to the UK GDPR. However, the processing must still be necessary and proportionate and meet all other UK GDPR requirements. That includes the general data protection principles in Article 5 of the UK GDPR, and the safeguards in relation to the processing of special category data in Article 9.

The Bill has significantly tightened up on the regulation-making power associated with this clause. The only processing activities that can be added to the list of recognised legitimate interests are those that serve the objectives of public interest, as described in Article 23(1) of the UK GDPR. The Secretary of State would also have to have regard to people’s rights and the fact that children may be less aware of the risks and consequences of the processing of their data before adding new activities to the list.

My noble friends Lord Davies of Brixton and Lord Stevenson of Balama—do you know, I have never had to pronounce his full name—Balmacara, raised NHS data. These clauses are intended to ensure that IT providers comply with relevant information standards in relation to IT use for health and adult social care, so that, where data is shared, it can be done in an easier, faster and cheaper way. Information standards create binding rules to standardise the processing of data where it is otherwise lawful to process that data. They do not alter the legal obligations that apply in relation to decisions about whether to share data. Neither the Department of Health and Social Care nor the NHS sells data or provides it for purely commercial purposes such as insurance or marketing purposes.

With regard to data assets, as raised by the noble Baroness, Lady Kidron, and my noble friend Lord Knight of Weymouth, the Government recognise that data is indeed one of the most valuable assets. It has the potential to transform public services and drive cutting-edge innovation. The national data library will unlock the value of public data assets. It will provide simple, secure and ethical access to our key public data assets for researchers, policymakers and businesses, including those at the frontier of AI development, and make it easier to find, discover and make connections across those different databases. It will sit at the heart of an ambitious programme of reform that delivers the incentives, investment and leadership needed to secure the full benefits for people and the economy.

The Government are currently undertaking work to design the national data library. In its design, we want to explore the best models of access so that public sector data benefits our society, much in the way that the noble Baroness, Lady Kidron, outlined. So, decisions on its design and implementation will be taken in due course.

Regarding the concerns of the noble Lord, Lord Markham, about cybersecurity, as announced in the King’s Speech, the Government will bring forward a cybersecurity and resilience Bill this Session. The Bill will strengthen our defences and ensure that more essential digital services than ever before are protected.

The noble Baroness, Lady Kidron, the noble Viscount, Lord Colville, and my noble friend Lord Stevenson of Balmacara, asked about the Government’s plans to regulate AI and the timing of this legislation. As set out in the King’s Speech, the Government are committed to establishing appropriate legislation for companies developing the most powerful AI systems. The Government will work with industry, civil society and experts across the UK before legislation is drawn up. I look forward to updating the House on these proposals in due course. In addition, the AI opportunities action plan will set out a road map for government to capture the opportunities of AI to enhance growth and productivity and create tangible benefits for UK citizens.

Regarding data scraping, as raised by the noble Baroness, Lady Kidron, the noble Viscount, Lord Colville of Culross, and others, although it is not explicitly addressed in the data protection legislation, any such activity involving personal data would require compliance with the data protection framework, in particular the requirement that the use of data be fair, lawful and transparent.

A number of noble Lords talked about AI in the creative industries, particularly the noble Lords, Lord Holmes and Lord Freyberg—

I am sorry to interrupt what is a very fluent and comprehensive response. I do not want to break the thread, but can I press the Minister a little bit on those companies whose information which is their intellectual property is scraped? How will that be resolved? I did not pick up from what the Minister said that there was going to be any action by the Government. Are we left where we are? Is it up to those who feel that their rights are being taken away or that their data has been stolen to raise appropriate action in the courts?

I was going to come on to some of those issues. Noble Lords talked about AI in the creative industries, which I think my noble friend is particularly concerned about. The Government are working hard on this and are developing an effective approach that meets the needs of the UK. We will announce more details in due course. We are working closely with relevant stakeholders and international partners to understand views across the creative sector and AI sectors. Does that answer my noble friend’s point?

With respect, it is the narrow question that a number of us have raised. Training the new AI systems is entirely dependent on them being fed vast amounts of material which they can absorb, process and reshape in order to answer questions that are asked of them. That information is to all intents and purposes somebody else’s property. What will happen to resolve the barrier? At the moment, they are not paying for it but just taking it—scraping it.

Perhaps I may come in too. Specifically, how does the data protection framework change it? We have had the ICO suggesting that the current framework works perfectly well and that it is the responsibility of the scrapers to let the IP holders know, while the IP holders have not a clue that it is being scraped. It is already scraped and there is no mechanism. I think we are a little confused about what the plan is.

I can certainly write to noble Lords setting out more details on this. I said in response to an Oral Question a few days ago that my honourable friend Minister Clark in DSIT and Chris Bryant, whom the noble Lord, Lord Russell, mentioned, are working jointly on this. They are looking at a proposal that can come forward on intellectual property in more detail. I hope that I can write to noble Lords and set out more detail on that.

On the question of the Horizon scandal and the validity of computers, raised, quite rightly, by the noble Lords, Lord Arbuthnot and Lord Holmes, and the noble Baroness, Lady Kidron, I think we all understand that the Horizon scandal was a terrible miscarriage of justice, and the convictions of postmasters who were wrongly convicted have been rightly overturned. Those Post Office prosecutions relied on assertions that the Horizon system was accurate and reliable, which the Post Office knew to be wrong. This was supported by expert evidence, which it knew to be misleading. The issue was not, therefore, purely about the reliability of the computer-generated evidence. Almost all criminal cases rely to some extent on computer evidence, so the implications of amending the law in this area are far-reaching, a point made by several noble Lords. The Government are aware that this is an issue, are considering this matter very carefully and will announce next steps in due course.

Many noble Lords, including the noble Lords, Lord Clement-Jones, Lord Vaux and Lord Holmes of Richmond, and the noble and learned Lord, Lord Thomas, raised automated decision-making. I noted in my opening speech how the restored accountability framework gives us greater confidence in ADM, so I will not go over that again in detail. But to explain the Bill’s drafting, I want to reassure noble Lords that, under the Bill, an organisation must first inform individuals if a legal or significant decision has been taken in relation to them based solely on automated processing, and must then give individuals the opportunity to challenge such decisions, obtain human intervention for them and make representations about them to the controller.

The regulation-making powers will future-proof the ADM reforms in the Bill, ensuring that the Government will have the powers to bring greater legal certainty, where necessary and proportionate, in the light of constantly evolving technology. I reiterate that there will be the right to human intervention, and it will be on a personal basis.

The noble Baroness, Lady Kidron, and the noble Lords, Lord Russell of Liverpool and Lord Clement-Jones, raised concerns about edtech. The Government recognise that concerns have been raised about the amount of personal data collected by education technology used in schools, and whether this is fully transparent to children and parents. The Department for Education is committed to improving guidance and support for schools to help them better navigate this market. For example, its Get Help with Data Protection in Schools project has been established to help schools develop guidance and tools to help them both understand and comply with data protection legislation. Separately, the ICO has carried out a series of audits on edtech service providers, assessing privacy risks and potential non-compliance with data protection regulations in the development, deployment and use of edtech solutions in schools.

The creation of child sexual abuse material, CSAM, through all mediums including AI—offline or online—is and continues to be illegal. This is a forefront priority for this Government and we are considering all levers that can be utilised to fight child sexual abuse. Responsibility for the law in this area rests with the Home Office; I know it is actively and sympathetically looking at this matter and I understand that my colleague the Safeguarding Minister will be in touch with the noble Baroness, Lady Kidron, and the noble Lord, Lord Bethell, ahead of Committee.

I can see that I am running out of time so, rather than testing noble Lords’ patience, I will draw my comments to a close. I have not picked up all the comments that colleagues made, but I thank everybody for their excellent contributions. This is the beginning of a much longer conversation, which I am very much looking forward to, as I am to hearing from all those who promised to participate in Committee. I am sure we will have a rich and interesting discussion then.

I hope I have persuaded some noble Lords that the Bill is not only wide-ranging but has a clear and simple focus, which is about growing the economy, creating a modern, digital government and, most importantly, improving people’s lives, which will be underpinned by robust personal data protection. I will not say any more at this stage. We will follow up but, in the meantime, I beg to move.

Bill read a second time.

Commitment and Order of Consideration Motion

Moved by

That the Bill be committed to a Grand Committee, and that it be an instruction to the Grand Committee that they consider the Bill in the following order: Clauses 1 to 56, Schedule 1, Clauses 57 and 58, Schedule 2, Clauses 59 to 65, Schedule 3, Clauses 66 to 70, Schedule 4, Clause 71, Schedule 5, Clauses 72 to 80, Schedule 6, Clauses 81 to 84, Schedules 7 to 9, Clauses 85 to 102, Schedule 10, Clauses 103 to 107, Schedule 11, Clauses 108 to 111, Schedule 12, Clauses 112 and 113, Schedule 13, Clauses 114 and 115, Schedule 14, Clauses 116 to 119, Schedule 15, Clause 120, Schedule 16, Clauses 121 to 138, Title.

Motion agreed.