Second Reading
Scottish, Welsh and Northern Ireland Legislative Consent sought.
Moved by
That the Bill be now read a second time.
My Lords, in a time of rapid technological change, we need people to trust in how we use data for the greater good. By building understanding and confidence in the rules surrounding how we use data, we can unlock its real potential, not only for businesses but for people going about their everyday lives.
In 2018 Parliament passed the Data Protection Act, which was the UK’s implementation of the EU general data protection regulation. While the EU GDPR protected the privacy rights of individuals, there were unintended consequences. It resulted in high costs and a disproportionate compliance burden for small businesses. These reforms deliver on the Government’s promise to use the opportunity afforded to us by leaving the European Union to create a new and improved UK data rights regime.
The Bill has five parts that deliver on individual elements of these reforms. Part 1 updates and simplifies the UK GDPR and the DPA 2018 to ease compliance burdens on businesses and introduce safeguards against risks from new technologies. It also updates the similar regimes that apply to law enforcement agencies and intelligence services. Part 2 enables DSIT’s digital verification services policy, giving people secure options to prove their identity digitally across different sectors of the economy if they choose to do so. Part 3 establishes a framework to set up smart data schemes across the economy. Part 4 reforms the privacy and electronic communications regulations—PECR—to bring stronger protection for consumers against nuisance calls. It also contains reforms to ensure the better use of data in health and adult social care, law enforcement and security. Part 5 will modernise the Information Commissioner’s Office by making sure that it has the capabilities and the powers to tackle organisations that breach data rules, giving the ICO freedom to better allocate its resources and ensuring that it is more accountable to Parliament and to the public.
I stress that the Bill will continue to maintain the highest standards of data protection that people rightly expect. It will also help those who use our data to make our lives healthier, safer and more prosperous. That is because we have convened industry leaders and experts to co-design the Bill with us throughout its creation. This legislation will ensure that our regulation reflects the way in which real people live their lives and run their businesses.
On Report in the other place, we tabled a number of amendments to strengthen the fundamental elements of the Bill and to reflect the Government’s commitment to unleash the power of data across our economy and society. I take this opportunity to thank Members of Parliament and the numerous external stakeholders who have worked with us to ensure that the Bill functions at its absolute best. Taken together, these amendments will benefit the economy by £10.6 billion over 10 years. This is more than double the estimated impact of the Bill when introduced in the spring.
These reforms are expected to lower the compliance burden on businesses. We expect small and micro-businesses to achieve greater overall compliance cost savings than larger businesses. We expect these compliance cost savings for small and micro-businesses to be approximately £90 million a year as a result of the domestic data protection policies in the Bill.
The Bill makes it clear that the amount that any organisation needs to do to comply and demonstrate compliance should be directly related to the risk its processing activities pose to individuals. That means that in the future, organisations will have to keep records of their processing activities, undertake risk assessments and designate senior responsible individuals to manage data protection risks only if their processing activities are likely to pose high risks to individuals. We are also removing the need for organisations to do detailed legitimate interest assessments and document the outcomes when their activities are clearly in the public interest—for example, when they are reporting child safeguarding concerns. This will help reduce the amount of privacy paperwork and allow businesses to invest time and resources elsewhere.
Let me make this absolutely clear: enabling more effective use of data and ensuring high data protection standards are not contradictory objectives. Businesses need to understand and to trust in our data protection rules, and that is what these measures are designed to achieve. At the same time, people across the UK need to fundamentally trust that the system works for them too. We know that lots of organisations already have good processes for how they deal with data protection complaints, and it is right that we strengthen this. By making these a requirement, the Bill helps data subjects exercise their rights and directly challenge organisations they believe are misusing their data.
We already have a world-leading independent regulator, the Information Commissioner’s Office. It is only right that we continue to provide the ICO with the tools it needs to keep pace with our dramatically changing tech landscape. The ICO needs to keep our personal data safe while ensuring that it remains accountable, flexible and fit for the modern world. We are modernising the structure and objectives of the Information Commissioner’s Office. Under this legislation, protecting our personal data will remain the ICO’s primary focus, but it will also need to consider how it can empower businesses and organisations to drive growth and innovation across the UK and support public trust and confidence in the use of personal data. We must ensure that our world-leading regulator is equipped to tackle the biggest and most important threats and data breaches, protecting individuals from the highest harm. The Bill means that the ICO can take a more proportionate approach to how it gets involved in individual disputes, not having to do so too early in the process before people have had a chance to resolve things sensibly themselves, while still being the ultimate guardian of data subjects’ rights.
The Bill will create a modern ICO that can tackle the modern, more sophisticated challenges of today and support businesses across the UK to make safe, effective use of data to grow and to innovate. It will also unlock the potential of transformative technologies by making sure that organisations know when they can use responsible automated decision-making and that people know when they can request human intervention where these decisions impact their lives.
Alongside this, there are billions of pounds to be seized in the booming global data-driven trade. With the new international transfers regime, we are clarifying our regime for building data bridges to secure the close, free and safe exchange of data with trusted allies. Alongside new data bridges, the Secretary of State will be able to recognise new transfer mechanisms for businesses to protect international transfers. Businesses will still be able to transfer data across borders with the compliant mechanisms they already use, avoiding needless checks and costs.
The Bill will allow people to control more of their data. It will support smart data schemes that empower consumers and small businesses to make better use of their own data, building on the extraordinary success of open banking, where consumers and businesses access innovative services to manage their finances and spending, track their carbon footprint or access credit. Open banking is already estimated to have the potential to bring in £12 billion each year for consumers and £6 billion for small businesses, as well as boosting innovation in our world-leading fintech industry. With this Bill, we can extend the same benefits to consumers and businesses across the economy.
Another way the Bill ensures that people have control of their own data is by making it easier and more secure for people to prove things about themselves. Digital identities will help those who choose to use them to prove their identity electronically rather than always having to dig out stacks of physical documents such as passports, bills, statements and birth certificates. Digital verification services are already in existence and we want to put them on a secure and trusted footing, giving people more choice and confidence as they navigate everyday tasks, and saving businesses time and money.
The Bill supports the growing demand, domestic and global, for secure and trusted electronic transactions such as qualified electronic signatures. It also makes provision for the preservation of important data for coronial investigations in the event of a child taking their own life. Any death of a child is a tragedy, and the Government have the utmost sympathy for families affected by this tragic issue. I recognise, and I share, the strong feelings expressed by noble Lords on this matter, including during the passage of the Online Safety Act.
The new provision requires Ofcom, following notification from a coroner, to issue data preservation notices requiring relevant tech companies to hold data that they may have relating to a deceased child’s use of online services in circumstances where the coroner suspects that the child has taken their own life. This greatly strengthens Ofcom’s and a coroner’s ability to access data from online services and provides them with the tools they need to carry out their job. This will include, for example, cases where a child has taken their own life after interacting with self-harm or other harmful content online, or where the coroner suspects that the child may have been subjected to coercion, online bullying or harassment. It would also include cases where a child has done an intentional act that has caused their death but where they may not have intended to die, such as the tragic circumstances where a child dies accidentally when attempting to recreate an online challenge.
The new provisions do not cover children’s deaths caused by homicide, because the police already have extensive investigative powers in this context. These were strengthened last year by the entry into force of the UK-US data access agreement, which enables law enforcement to directly access content of communications held by US-based companies for the purpose of preventing, detecting, investigating and prosecuting serious crimes, such as murder and child sexual abuse and exploitation.
The families who have been courageously campaigning after their children were tragically murdered did not have access to this agreement because it entered into force only last October. To date, 10,000 requests for data have been made under it. However, we understand their concerns, and the Secretary of State, along with Justice Ministers, will work with noble Lords ahead of Committee and carefully listen to their arguments on potential amendments. We absolutely recognise the need to give families the answers they need and to ensure that there is no gap in the law.
Some aspects of the GDPR are very complex, causing uncertainty about how it applies and hampering private and public bodies’ ability to use data as dynamically as they could. The Bill will help scientists make the most of data by ensuring that it can be reused for other related studies. This is achieved by removing burdensome requirements for scientific researchers, so that they can dedicate more time to what they do best. The Bill will also simplify the legal requirements around research and bring legal clarity. This is achieved by transposing the definitions of research for scientific, historical and statistical purposes into the operative text.
The Bill will improve the way that the NHS and adult social care organise data to deliver crucial health services in England. It will also improve the efficiency of data protection for law enforcement and national security partners, encouraging better use of personal data to help protect the public. The Bill will save up to 1.5 million hours of police time each year.
The Bill will also allow us to take further steps to safeguard our national security, by addressing risks from hostile agents seeking to access our data or damage our data infrastructure. It will allow the DWP to protect taxpayers’ money from falling into the hands of fraudsters, as part of the DWP’s biggest reform to fraud legislation in 20 years. We know that, over this last year, overpayments due to capital fraud and error in universal credit alone were almost £900 million. It is time to modernise and strengthen the DWP’s legislative framework to ensure that it gives those fighting fraud and error the tools that they need and that it stands up to future challenges.
Through the Bill we are revolutionising the way we install, maintain, operate and repair pipes and cables buried beneath the ground. I am sure we have all, knowingly or not, been impacted by one of the 60,000 accidental strikes on an underground pipe or cable that happen every year. The national underground asset register—NUAR—is a brand new digital map that gives planners and excavators secure and instant access to the data they need, when they need it. This means not only that the safety and lives of workers will no longer be at risk but that NUAR will underpin the Government’s priority to get the economy growing, expediting projects such as new roads, new houses and broadband rollout.
The Bill gives the people using data to improve our lives the certainty that they need. It maintains high standards for protecting people’s privacy, while seeking to maintain the EU’s adequacy decisions for the UK. The Bill is a hugely important piece of legislation and I thank noble Lords across the House for their involvement in and support for the Bill so far. I look forward to hearing their views today and throughout the rest of the Bill’s passage. I beg to move.
My Lords, I start with apologies from my noble friend Lady Jones of Whitchurch, who cannot be with us due to illness. We wish her a speedy recovery in time for Christmas. I have therefore been drafted in temporarily to open for the Opposition, shunting my noble friend Lord Bassam to close for us at the end of the debate. As a result, what your Lordships will now get with this speech is based partly on his early drafts and partly on my own thoughts on this debate—two for the price of one. I reassure your Lordships that, while I am flattered to be in the super-sub role, I look forward to returning to the Back Benches for the remaining stages in the new year.
I remind the House of my technology interests, particularly in chairing the boards of CENTURY Tech and EDUCATE Ventures Research—both companies working with AI in education. I very much welcome the noble Lord, Lord de Clifford, to his place and look forward to his maiden speech.
Just over six years ago, I spoke at the Second Reading of the Data Protection Bill. I said then that:
“We need to power the economy and innovation with data while protecting the rights of the individual and of wider society from exploitation by those who hold our data”.
For me, that remains the vision. We are grateful to the Minister for setting out his vision in his speech, but it feels to me that one of the Bill’s failings is the weakening of the protection from exploitation that would follow if it passes in its current form. In that 2017 Second Reading speech, I also said that:
“No consent regime can anticipate future use or the generation of intelligent products by aggregating my data with that of others. The new reality is that consent in its current form is dead”.—[Official Report, 10/10/17; cols. 183-5.]
Now that we have moved squarely into the age of AI, I welcome the opportunity to update GDPR to properly regulate data capture, storage and sharing in the public interest.
In the Online Safety Act, we strengthened Ofcom to regulate technology providers and their algorithmic impacts. In the Digital Markets, Competition and Consumers Bill, we are strengthening the Competition and Markets Authority to better regulate these powerful acquisitive commercial interests. This Bill is the opportunity to strengthen the Information Commissioner to better regulate the use of data in AI and some of the other potential impacts discussed at the recent AI summit.
This is where the Bill is most disappointing. As the Ada Lovelace Institute tells us in its excellent briefing, the Bill does not provide any new oversight of cutting-edge AI developments, such as biometric technologies or foundation models, despite well-documented gaps in existing legal frameworks. Will the Minister be coming forward with anything in Committee to address these gaps?
While we welcome the change from an Information Commissioner to a broader information commission, the Bill further weakens the already limited legal safeguards that protect individuals from AI systems that make automated decisions about them in ways that could lead to discrimination or disadvantage—another lost opportunity.
I co-chair the All-Party Parliamentary Group on the Future of Work, and will be seeking to amend the Bill in respect of automated decision-making in the workplace. The rollout of ChatGPT-4 now makes it much easier for employers to quickly and easily develop algorithmic tools to manage staff, from hiring through to firing. We may also want to provide safeguards over public sector use of automated decision-making tools. The latter is of particular concern when reading the legal opinion of Stephen Cragg KC on the Bill. He says that:
“A list of ‘legitimate interests’ (mostly concerning law and order, safeguarding and national security) has been elevated to a position where the fundamental rights of data subjects (including children) can effectively be ignored where the processing of personal data is concerned … The Secretary of State can add to this list without the need for primary legislation, bypassing important Parliamentary controls”.
Furthermore, on lost opportunities, the Bill does not empower regulators with the tools or capabilities that they need to implement the Government’s plans for AI regulation or the commitments made at the AI Safety Summit. In this, I personally support the introduction of a duty on all public regulators to have regard to the principles on AI that were published in the Government’s White Paper. Would the Minister be willing to work with me on that?
There are other lost opportunities. I have argued elsewhere that data trusts are an opportunity to build public trust in people’s data being used both to develop better technology and to generate revenue back to the taxpayer. I remain interested in whether personal data could be defined as an asset that can be bequeathed in one’s estate, to avoid what we discussed in our debates on what is now the Online Safety Act, where bereaved families have had a terrible experience trying to access the content their children saw online that contributed to their deaths—and not just from suicide.
This takes me neatly on to broken promises and lessons not learned. I am confident that, whether the Government like it or not, the House will use this Bill to keep the promises made to families by the Secretary of State in respect of coroners being able to access data from technology providers in the full set of scenarios that we discussed, not just self-harm and suicide. It is also vital that the Bill does nothing to contradict or otherwise undermine the steps that this country has taken to keep children safe in the digital world. I am sure we will hear from the noble Baroness, Lady Kidron, on this subject, but let me say at this stage that we support her and, on these Benches, we are fully committed to the age-appropriate design code. The Minister must surely know that in this House, you take on the noble Baroness on these issues at your peril.
I am also confident that we will use this Bill to deliver an effective regime on data access for researchers. During the final parliamentary stages of the Online Safety Bill, the responsible Ministers, Paul Scully MP and the noble Lord, Lord Parkinson, recognised the importance of going further on data access and committed in both Houses to exploring this issue and reporting back on the scope to implement it through other legislation, such as this Bill. We must do that.
The Bill has lost opportunities and broken promises, but in other areas it is also failing. The Bill is too long—probably like my speech. I know that one should not rush to judgment, but the more I read the Bill and various interpretations of its impact, the more I worry about it. That has not been helped by the tabling of some 260 government amendments, amounting to around 150 pages of text, on Report in another place—that is, after the Bill had already undergone its line-by-line scrutiny by MPs. Businesses need to be able to understand this new regime. If they also have any data relationship with the EU, they potentially also need to understand how this regime interacts with the EU’s GDPR. On that, will the Minister agree to share quickly with your Lordships’ House his assessment of whether the Bill meets the adequacy requirements of the EU? We hear noises to the contrary from the Commission, and it is vital that we have the chance to assess this major risk.
After the last-minute changes in another place, the Bill increasingly seems designed to meet the Government’s own interests: first, through changes to rules on direct marketing during elections, but also by giving Ministers extensive access to the bank account data of benefit claimants and pensioners without spelling out the precise limitations or protections that go alongside those powers. I note the comments of the Information Commissioner himself in his updated briefing on the Bill:
“While I agree that the measure is a legitimate aim for government, given the level of fraud and overpayment cited, I have not yet seen sufficient evidence that the measure is proportionate ... I am therefore unable, at this point, to provide my assurance to Parliament that this is a proportionate approach”.
In starting the scrutiny of these provisions, it would be useful if the Minister could confirm in which other countries such provisions already exist. What consultation have they been subject to? Does HMRC already have these powers? If not, why go after benefit fraud but not tax fraud?
Given that these provisions can never now receive detailed scrutiny in the other place, I of course assume the Government will respect the will of this House, whatever it may be, when we have debated these measures.
As we did during last week’s debate on the Digital Markets, Competition and Consumers Bill, I will now briefly outline a number of other areas where we will be seeking changes or greater clarity from the Government. We need to see a clear definition of high-risk processing in the Bill. While the Government might not like subject access requests after recent experience of them, they have not made a convincing case for significantly weakening data-subject rights. Although we support the idea of smart data initiatives such as extending the successful open banking framework to other industries, we need more information on how Ministers envisage this happening in practice. We need to ensure the Government’s proposals with regards to nuisance calls are workable and that telecommunications companies are clear about their responsibilities. With parts of GDPR, particularly those on the use of cookies, having caused so much public frustration, the Bill needs to ensure appropriate consultation on and scrutiny of future changes in this area. We must take the public with us.
So a new data protection Bill is needed, but perhaps not this one. We need greater flexibility to move with a rapidly changing technological landscape while ensuring the retention of appropriate safeguards and protections for individuals and their data. Data is key to future economic growth, and that is why it will be a core component of our industrial strategy. However, data is not just for growth. There will be a clear benefit in making data work for the wider social good and the empowerment of working people. There is also, as we have so often discussed during Oral Questions, huge potential for data to revitalise the public services, which are, after 13 years of this Government, on their knees.
This Bill seems to me to have been drafted before the thinking that went into the AI summit. It is already out of date, given its very slow progress through Parliament. There is plenty in the Bill that we can work with. We are all agreed there are enormous opportunities for the economy, our public services and our people. We should do everything we can to take these opportunities forward. I know the Minister is genuinely interested in collaborating with colleagues to that end. We stand ready to help the Government make the improvements that are needed, but I hope the Minister will acknowledge that there is a long way to go if this legislation is to have public confidence and if our data protection regime is to work not just for the tech monopolies but for small businesses, consumers, workers and democracy too. We must end the confusion, empower the regulators and in turn empower Parliament to better scrutinise the tsunami of digital secondary legislation coming at us. There is much to do.
My Lords, even less than the noble Lord, Lord Knight, can I claim that this is my primary brief, so I want to make a short Back-Bench contribution to the subject, bringing some of my experience from former interests. I declare that I do not have any current financial interests but, if you look at my register entry, you will see that I spent a long time working for a company that was so much at the heart of the data protection debate that the 2016 EU regulation was nicknamed in Brussels “Lex Facebook”.
I do not want to speak to the details of the provisions in front of us, and I look forward to hearing some of the arguments, particularly from the noble Baroness, Lady Kidron, with whom I worked closely in the context of the Online Safety Act; I think she has some really important points to raise on what is in the Bill. I also look forward to the maiden speech of the noble Lord, Lord de Clifford.
The one thing I really want to spend a short amount of time on today is to flag a concern that I will not attempt to resolve: I would rather leave that to my noble friend Lord Clement-Jones and others who will be in Committee on the Bill. It is the concern around EU adequacy that I think should really be front and centre of our discussions when we consider this legislation. As I say, I do not intend to be active in later stages of the Bill—unless we fix the NHS between now and Committee, which would be a blessing for more reasons than enabling me to take part in consideration of data protection legislation.
The flag that I am raising will be in something of a Cassandra-like tone. It is something I think is very likely to happen, but I am not expecting the Government to believe me and necessarily change direction. I have been intimately involved in these discussions over many years. If people have been following this, they will know that the EU had an adequacy agreement with the United States that had full political support within the EU institutions but has successively been struck down in a series of actions in the European Court of Justice. All the politicians wanted data to flow freely between the United States and the EU, but the law has not allowed that to happen. So the alarm bells ring. The noble Lord, Lord Knight of Weymouth, said he thought the Commission had doubts; that worries me even more. Even where the Commission is saying that it is comfortable with the adequacy of the UK regime, the alarm bells still ring for me because it said that repeatedly over the US data transfers and it turned out not to be the case.
There are three main areas where we can predict that the risk will occur. The first is where the core legal regime for data protection in the UK is deemed to be too weak to protect the interests of EU data subjects. The second is where there are aspects of the UK legal regime for security-related surveillance that are seen as creating unacceptable risk if EU data is in the hands of UK entities. The third is where redress mechanisms for EU data subjects, especially in relation to surveillance, are regarded as inaccessible or ineffective. These are all the areas that have been tested thoroughly in the context of the United States, and any or all of them may end up being tested also in the European Court of Justice for the United Kingdom if EU citizens complain in future about the processing of their data in the UK. The first angle will test the complete package of data protection set out in the many pages of this Bill. The second will consider our surveillance practices, including new developments such as the Investigatory Powers (Amendment) Bill, which is before us right now. Any future changes to UK surveillance law, for example, following a terrorist outrage, may end up being tested and queried before the European Court of Justice.
Regarding redress, our relationship with the European Court of Human Rights is critical. Any suggestion that we start to ignore ECHR judgments, even in another area such as immigration policy, may be used to argue that EU citizens cannot rely on their Article 8 right to privacy in the United Kingdom. My advice to the Minister is to properly test all these angles internally, on the assumption that we will be arguing them out at the European Court of Justice in the future. This is difficult. I know that the UK authorities, like the US authorities, will not be comfortable sharing details of their surveillance regime in a European court, but that is what will be required to prove that we are adequately safe if a complaint in respect of UK surveillance is made. It is really important that we hear the strongest lines of attack: the Government should invite in the privacy activists and the kinds of people who will be taking those court cases, so that Ministers can hear those arguments now and test all our legislation against them. We certainly should not rely on assurances from the European Commission; I hope the Minister can give us more than that in his response. The key dynamic from the transatlantic experience is that this plays out between EU privacy activists and the European courts, rather than being something the Commission entirely controls.
The consequences of the loss of EU adequacy, or even significant uncertainty that this is on the horizon, will be that UK businesses that work on a cross-channel basis will be advised by their lawyers to move their data processing capability into the EU. They would feel confident serving the UK from the EU, but not the other way around. This is precisely what has happened in the context of transatlantic data flows and will hardly make Britain the best place in the world to do e-business. I hope the Minister will confirm that it would be a very undesirable outcome, to use parliamentary language, and that we will be taking one step forward but two steps back if that is a consequence of this Bill.
Having planted that flag, it is regrettable that I will be unable to help noble Lords as they try to thread the needle of getting this legislation right. I have every sympathy for those seeking to do that; I have less and less sympathy for the Government, because they chose to bring this legislation forward while leaving other important legislation, such as the mental capacity Bill, off the agenda, as I keep reminding them. I hope noble Lords will keep this Cassandra-like warning current in their minds as they consider the Bill; I do not want to be standing here in five years’ time saying, “I told you so”, and I do not think noble Lords want that either. With that ringing in your Lordships’ ears, I hope the Minister and the Members scrutinising the Bill can really dig into this adequacy point and not hold back, because it is a genuine, serious threat to all kinds of businesses in the United Kingdom, not just digital ones.
My Lords, I declare my interests set out in full on the register, including as an advisor to the Institute for Ethics in AI at Oxford University, chair of the Digital Futures for Children centre at the LSE and chair of the 5Rights Foundation. I add my welcome to my noble friend Lord de Clifford, who I had the pleasure of meeting yesterday, and I look forward to his maiden speech.
I start by quoting Marcus Fysh MP who said in the other place:
“this is such a serious moment in our history as a species. The way that data is handled is now fundamental to basic human rights … I say to those in the other place as well as to those on the Front Benches that … we should think about it incredibly hard. It might seem an esoteric and arcane matter, but it is not. People might not currently be interested in the ins and out of how AI and data work, but in future you can bet your bottom dollar that AI and data will be interested in them. I urge the Government to work with us to get this right”.—[Official Report, Commons, 29/11/23; col. 878.]
He was not the only one on Report in the other place who was concerned about some of the provisions in the Bill, who bemoaned the lack of scrutiny and urged the Government to think again. Nor was he the only one who reluctantly asked noble Lords to send the Bill back to the other place in better shape.
I associate myself with the broader points made by both noble Lords who have already spoken—I do not think I disagreed with a word that they said—but my own comments will primarily focus on the privacy of children, the case for data communities, access for researchers and, indeed, the promises made to bereaved parents and then broken.
During the passage of the Data Protection Act 2018, your Lordships’ House, with cross-party support, introduced the age-appropriate design code, a stand-alone data protection regime for the under-18s. The AADC’s privacy-by-design approach ushered in a wave of design change to benefit children: TikTok and Instagram disabled direct messaging from unknown adults to children; YouTube turned off auto-play; Google turned safe search on by default for children; 18-plus apps were taken out of the Play Store; TikTok stopped notifications through the night; and Roblox stopped tracking and targeting children for advertising. These were just a handful of hundreds of changes to products and services likely to be accessed by children. Many of these changes have been rolled out globally, meaning that, while other jurisdictions cannot police the code, children in those places benefit from it. As the previous Minister, the noble Lord, Lord Parkinson, acknowledged, it contributes to the UK’s reputation for digital regulation and is now being copied around the globe.
I set this out at length because the AADC not only drove design change but established the crucial link between privacy and safety. This is why it is hugely concerning that children have not been explicitly protected from changes that lessen user data protections in the Bill. I have given Ministers notice that I will seek to enshrine the principle that children have the right to a higher bar of data protection by design and default; to define children’s data as sensitive personal data in the Bill; and to exclude children from proposals that risk eroding the impact of the AADC, notably in risk assessments, automated processing, onward processing, direct marketing and the extended research powers of commercial companies.
Minister Paul Scully said at Second Reading in the other place:
“We are committed to protecting children and young people online … organisations will still have to abide by our Age-appropriate design code”.—[Official Report, Commons, 17/4/23; col. 101.]
I take it from those words that any perceived diminution of children’s data rights is inadvertent, and that it remains the Government’s policy not to weaken the AADC as currently configured in the Bill. Will the Minister confirm that it is indeed the Government’s intention to protect the AADC and that he is willing to work with me to ensure that that is the outcome? I will also seek a requirement for the ICO to create a statutory children’s code in relation to AI. The ubiquitous deployment of AI technology to recommend and curate is nothing new, but the rapid advances in generative AI capabilities mark a new stage in its evolution. In the hundreds of pages of the ICO’s non-binding Guidance on AI and Data Protection, its AI and Data Protection Risk Toolkit and its advice to developers on generative AI, there is but one mention of the word “child”—in a case study about child benefit.
The argument made was that children are covered by the AADC, which underlines again just how consequential it is. However, since adults are covered by data law yet it is considered necessary to have specific AI guidance, the one in three users who are under 18 deserve the same consideration. I am not at liberty to say more today, but later this week—perhaps as early as tomorrow—information will emerge that underlines the urgent need for specific consideration of children’s safety in relation to generative models. I hope that the Minister will agree that an AI code for kids is an imperative rather than a nice-to-have.
Similarly, we must deliver data privacy to children in education settings. Given the extraordinary rate at which highly personal data seeps out of schools into the commercial world, including to gambling companies and advertisers, coupled with the scale of tech adoption in schools, it is untenable to continue to see tech inside school as a problem for schools and tech outside school as a problem for regulators. The spectre of a nursery teacher having enough time and knowledge to interrogate the data protection terms of a singing app, or of the school ICT lead having to tackle global companies such as Google and Microsoft to set the terms for their students’ privacy, is frankly ridiculous, but that is the current reality. Many school leaders feel abandoned by the Government’s insistence that they should be responsible for data protection when both the AADC and the Online Safety Act have been introduced but they benefit from neither. It should be the role of the ICO to set data standards for edtech and to ensure that providers are held to account if they fall short. As it stands, a child enjoys more protection on the bus to school than in the classroom.
Finally on issues relating to children, I want to raise a technical issue around the production of AI-generated child sexual abuse material. I recognise the Government’s exemplary record on tackling CSAM but, unfortunately, innovation does not stop. While AI-generated child sexual abuse content is firmly in scope of UK law, it appears that the models or plug-ins trained on CSAM, or trained to generate it, are not. At least four laws, the earliest from 1978, are routinely used to bring criminal action against CSAM and its perpetrators, so I would be grateful if the Minister would agree to explore the issue with the police unit that has raised it with me and make an explicit commitment to close any gaps identified.
We are at an inflection point, and however esoteric and arcane the issues around data appear to be, to downgrade a child’s privacy even by a small degree has huge implications for their safety, identity and selfhood. If the Government fail to protect and future-proof children’s privacy, they will be simply giving with one hand in the OSA and taking away with the other in this Bill.
Conscious that I have had much to say about children, I will briefly put on the record issues that we can debate at greater length in Committee. While data law largely rests on the assumption of a relationship between an individual and a service, we have seen over a couple of decades that power lies in having access to large datasets. The Bill offers a wonderful opportunity to put that data power in the hands of new entrants to the market, be they businesses or communities, by allowing the sharing of individual data rights and being able to assign data rights to third parties for agreed purposes. I have been inspired by approaches coming out of academia and the third sector which have supported the drafting of amendments to find a route that would enable the sharing of data rights.
Similarly, as the noble Lord, Lord Knight, said, we must find a route to access commercial datasets for public interest research. I was concerned that, in the other place, when the former Secretary of State Jeremy Wright queried why a much-touted research access had not materialised in the Bill, the Minister appeared to suggest that it was covered. The current drafting embeds the asymmetries of power by allowing companies to access user data, including for marketing and creating new products, but does not extend access for public interest research into the vast databases held by those same companies. There is a feeling of urgency emerging as our academic institutions see their European counterparts gain access to commercial data because of the DSA. There is an increased need for independent research to support our new regulatory regimes such as the Online Safety Act. This is an easy win for the Government and I hope that they grasp it.
Finally, I noted very carefully the words of the Minister when he said, in relation to a coroner’s access to data, that the Secretary of State had made an offer to fill the gap. This is a gap that the Government themselves created. During the passage of the Online Safety Act we agreed to create a humane route to access data when a coroner had reason to suspect that a regulated company might have information relevant to the death of a child. The Government have reneged by narrowing the scope to those children taking their own life. Expert legal advice says that there are multiple scenarios under which the Government’s narrowing scope creates a gaping hole in provision for families of murdered children and has introduced uncertainty and delay in cases where it may not be clear how a child died at the outset.
I must ask the Minister what the Government are trying to achieve here and who they are trying to please. Given the numbers, narrowing scope is unnecessary, disproportionate and egregiously inhumane. This is about parents of murdered children. The Government lack compassion. They have created legal uncertainty and betrayed and re-traumatised a vulnerable group to whom they made promises. As we go through this Bill and the competition Bill, the Minister will at some points wish the House to accept assurances from the Dispatch Box. The Government cannot assure the House until the assurances that they gave to bereaved parents have been fulfilled.
I will stop there, but I urge the Minister to respond to the issues that I have raised rather than leave them for another day. The Bill must uphold our commitment to the privacy and safety of children. It could create an ecosystem of innovative data-led businesses and keep our universities at the forefront of tech development and innovation. It simply must fulfil our promise to families who this Christmas and every other Christmas will be missing a child without ever knowing the full circumstances surrounding that child’s death. That is the inhumanity that we in this House promised to stop—and stop it we must.
My Lords, I too welcome the noble Lord, Lord de Clifford, and look forward to his maiden speech. We on these Benches appreciate that there is a need for updated data protection legislation in order to keep up with the many technological advances that are taking place and, wherever possible, to simplify the processes for data processing. From this perspective, we welcome the Government’s ambition to remove unnecessary red tape and to support British businesses and our economy. However, as ever, these priorities need to be balanced against appropriate scrutiny of new legislation, and we must ensure that there are appropriate safeguards in the Bill to protect the human rights that are fundamental to our democracy.
I have been struck by just how many briefing papers I have received from the most extraordinarily diverse group of organisations. One thing that many of them highlight is the fact that, for many businesses that operate between the UK and the EU, this new legislation is no guarantee of simplified data processing. In fact, with the increased divergence between UK and EU data protection that this Bill will bring, it is worrying that we may struggle to work more closely with the EU. Working to two different standards and trying to marry two frameworks that are far less aligned does not sound like less red tape, nor does it sound particularly pro-business.
However, there is an important point in respect of the stated aims of the Bill. There are serious concerns from businesses, organisations and civil society groups across a wide range of sectors about the weakening of data protection law under this new Bill. Clause 1(2) tightens the definition of personal data, meaning that only data that could allow a processor or another party to identify the individual by
“reasonable means at the time of processing”
would count as personal data and be protected by law. As many others have drawn attention to, the use of the phrase “reasonable means” is imprecise and troubling. This will need to be more clearly defined as a minimum, or the clause revoked altogether. “Reasonable means” would include the cost of identifying the individual, as well as the time, effort and other factors besides. This would allow organisations to assess whether they have the resources to identify an individual, which would be an extremely subjective test, to say the least, and would put the power firmly in the hands of data processors when it comes to defining what is or is not personal data.
As an example, GeneWatch has highlighted that, under the new Bill, some genetic information will no longer be classed as “personal data” and safeguarded as such, allowing the police and security services to access huge amounts of the public’s genetic information without needing to go to court or to justify the requirement for this data. Crucially, data protection legislation should define what is or is not personal data by the type of data it is, not by how easy or feasible it may be for an organisation or third party to use that data to identify an individual at every given point. Personal data rights must continue to be protected in this country and in our law.
The new Bill also provides vastly expanded powers to the police and security services via Clause 19 and Clauses 28 to 30. As I read them, on the surface they do not look as though they provide proper accountability; perhaps the Minister can reassure me on that. Clause 19 would remove the requirement in the Data Protection Act 2018 for the police to justify why they have accessed an individual’s personal data. Clauses 28 to 30 allow the Home Secretary to authorise the police so that they do not need to comply with certain data protection laws via a national security certificate; this would give the police immunity even if they commit what would otherwise be a crime.
Taken together, these two measures give an extraordinary amount of unchecked power to the police and security services. With the amended approach to national security certificates, the police could not be challenged before the courts for how and why they had accessed data, so there would be no way to review what the Government are doing here or ensure that abuses of these powers do not take place. Can the Minister explain how such measures align with the democratic values on which this country and government are based?
The National AIDS Trust has been involved in cases where people living with HIV have had their HIV status shared, without their consent, by police officers, with a huge impact on the life of the individual in question. This is a serious breach of current data protection law. We must ensure that police officers are still required to justify why they have accessed specific personal data, as this evidence is vital in cases of police misconduct.
I am aware that there are many other concerns about this Bill. Noble Lords have touched on some of them, not least around online pornography, gambling and other matters that I hope other noble Lords will pick up on. In particular, there are doubts around the Bill’s compliance with the European Convention on Human Rights. We in this House must do our duty to properly scrutinise and, wherever necessary, amend this Bill to ensure that we have the proper legislation in place to protect and safeguard our data. I look forward to working with Ministers and Members of this House when we move into Committee on this Bill.
My Lords, it is a pleasure to follow the previous speakers, including my noble friend the Minister, the other Front-Benchers and the noble Baroness, Lady Kidron.
I start by thanking the House of Lords Library for its briefing—it was excellent, as usual—and the number of organisations that wrote to noble Lords so that we could understand and drill down into some of the difficulties and trade-offs we are going to have to look at. As with most legislation, we want to get the balance right between, for example, a wonderful environment for commerce and the right to privacy and security. I think that we in this House will be able to tease out some of those issues and, I hope, get a more appropriate balance.
I refer noble Lords to my interests as set out in the register. They include the fact that I am an unpaid adviser to the Startup Coalition and have worked with a number of think tanks that have written about tech and privacy issues in the past.
When I look at the Bill at this stage, I think that there are bits to be welcomed, bits that need to be clarified and bits that raise concern. I want to touch on a few of them before drilling down—I will not drill down into all of them, because I am sure that noble Lords have spoken or will speak on them, and we will have much opportunity for further debate.
I welcome Clause 129, which requires social media companies to retain information linked to a child suicide. However, I understand and share the concern of the noble Baroness, Lady Kidron, that this seems to be the breaking of a promise. The fact is that this was supposed to cover much more: data, harms to children and how we can protect our children. In some ways, we must remember the analogy about online spaces: when we were younger, before the online age, our parents were always concerned about us when we went beyond the garden gate; nowadays, we must look at the internet and the computers on our mobile devices as that garden gate. When children leave that virtual garden gate and go through into the online world, we must ask whether they are safe, in the same way that my parents worried about us when, as children, we went through our garden gate to go out and play with others.
Clauses 138 to 141, on a national underground asset register, are obviously very sensible; that proposal is probably long overdue. I have questions about the open electoral register, in particular the impact on the direct marketing industry. Once again, we want to get the balance right between commerce and ease of doing business, as my noble friend the Minister said, and the right to privacy.
I have concerns about Clauses 147 and 148 on abolishing the offices of the Biometrics Commissioner and the Surveillance Camera Commissioner. I understand that the responsibilities will be transferred but, thinking about the legislation that we have been debating in this place—such as the Online Safety Act—I wonder about the number of powers that we are giving to these regulators and whether they will have the bandwidth for them. Is there really a good reason for abolishing these two commissioners?
I share the concerns of the noble Lord, Lord Knight, about access to bank accounts. Surely people should have the right to know why their bank account has been accessed and have some protection so that not just anyone can access it. I know that it is not just anyone, but there are concerns about this, and the rules have to be clearer for people.
I have talked to the direct marketing industry. It sees the open electoral register as a valuable resource for businesses in understanding and targeting customers. However, it tells me that a recent court case between Experian and the ICO has introduced some confusion on the use of the register for business purposes. It is concerned that the Information Commissioner’s Office’s interpretation, requiring notification to every individual for every issue, presents challenges that could cost the industry millions and make the open electoral register unusable for it, perhaps pushing businesses to rely more on large tech companies. However, I understand that, at the same time, this may well be an issue where there are clear concerns about privacy.
I would like to understand the Government’s thinking on some of that where there is no harm—whether the interpretation goes too far or whether some clarification is needed in this area. Companies say they will be unable to target prospective customers; some of us may like that, but we should also remember that there is Clause 116 on unlawful direct marketing. The concern for many of us is that, while direct marketing is junk if we do not want it, sometimes we do respond to it. I wonder how we get that balance right; I hope we can tease some of that out. If the Government agree with the interpretation and the restrictions on the direct marketing industry, I wonder whether they can explain some of the reasons behind them. There may very well be good reasons.
I also want to look at transparency and data usage, not just for AI but more generally. It is obvious in the Government’s own AI White Paper that we want a pro-innovation approach to regulation, but we are also calling for transparency at a number of levels: of datasets and of algorithms. To be honest, even if we are given that transparency, do we have the ability to understand those algorithms and datasets? We still need that transparency. I am concerned about undermining that principle, and particularly about weakening subject access requests.
I am also interested in companies that have used your data to refuse an application and then tell you that they do not have to tell you why they refused it. Perhaps this is too much of a burden on companies, but I wonder whether we have a right to know which data was accessed when that decision was made. I will give a personal example: about a year ago, I applied for an account with a very clever online bank and was rejected. It told me I would have a decision within 48 hours; I did not. Two weeks later, I got a message on the app that said I had been rejected and that under the law it did not have to tell me why. I wrote to it and said, “Okay, you don’t have to tell me why, but could you delete all the data you have on me—what I put in?”. It said, “Oh, we don’t have to delete it until a certain time”. If we really own that data, I wonder whether there should be more of an expectation on companies to explain what data and information they used to make those decisions, which can be life-changing for many people. We have heard all sorts of stories about access to bank accounts and concerns about digital exclusion.
We really have to think about how much access individuals can have to the data that is used to refuse them, and also to their data when they leave a service or stop being a user. I also want to make sure that there is accountability. I want to know, in Clause 12, what “reasonable and proportionate search” means, particularly when data is processed by law enforcement and intelligence services. I think we need further clarification on some of this for our assurance.
We also have to recognise that, if we look at the online environment of the last 10, 15 or 20 years, at first we were very happy to give our data away to social media companies because we thought we were getting a free service, connecting with friends across the world et cetera. Only later did we realise that the companies were using this data and monetising it for commercial purposes. There is nothing wrong with that in itself, but we have to ask whose data it is. Is it my data? Does the company own it? For those companies that think they own it, why do they think that? We need some more accountability, to make sure that we understand which data we own and which we give away. Once again, the same thing might happen—you might stop being a user or customer of a service, or you might be rejected, but the company still holds your data.
As an academic, I recognise the need for greater access to data, particularly for online research. I welcome some of the mechanisms in the Online Safety Act that we debated. Does my noble friend the Minister believe that the Bill sufficiently addresses the requirements and incentives for large data holders to hold data for academic research, with all the appropriate safeguards in place? I wonder whether the Minister has looked at some of the proposals to allow this to happen more, perhaps with the information commission acting as an intermediary for datasets et cetera. Once again, though, I am concerned about giving even more power to the information commission, and about whether it has the bandwidth to exercise all the powers we are giving it.
On cookie consent, I understand the annoyance of cookie banners. I remember the debates about cookie consent when I was in the European Parliament, but at the time we supported it because we thought it was important for users to be told what was being done with their information. It has become annoying, just like those text messages we get when we go roaming; I supported that requirement during the roaming debates in the European Parliament because I did not want users to say they had not been warned about the cost of roaming. The problem is that such warnings become routine; people ignore them and tick boxes on terms and conditions without having read them because they are too long.
When it comes to cookies, I like the idea of exemptions from prior consent—a certain opt-out where there is no real harm—but I wonder whether it could be extended, for example so that cookies used to measure the performance and effectiveness of advertising are exempt from the consent requirements. I do not think this would fundamentally change the structure of the Bill, but I wonder whether we have the right balance here on harm, safety and the ability of companies to test the effectiveness of some of their direct marketing. Again, I am just interested in the Government’s thinking about the balance between privacy and commerce.
Like other noble Lords, I share concerns about the powers granted to the Secretary of State. I think they lack the necessary scrutiny and safeguards, and that there is a risk of undermining the operations of online content and service providers that rely on these technologies. We need to see some strengthening here and more assurances.
I have one or two other concerns. The Information Commissioner has powers to require people to attend interviews as part of an investigation; that seems rather Big Brother-ish to me, and I am not sure whether the Information Commissioner would want these abilities, but there might be good reasons. I just want to understand the Government’s thinking on this.
I know that on Report in the other place, both Dawn Butler MP and David Davis MP raised concerns about retaining the right to use non-digital verification systems. We all welcome verification systems, but the committee I sit on—the Communications and Digital Committee—recently wrote a report on digital exclusion. We are increasingly concerned about digital exclusion and people having a different level of service because they are digitally excluded. I wonder what additional assurances the Minister can give us on some of those issues. The Minister in the other place said:
“Individual choice is integral … digital verification services can be provided only at the request of the individual”.—[Official Report, Commons, 29/11/23; col. 913.]
I think that any further assurance on verification would be really important.
The last point I turn to is EU adequacy. Let me be quite clear: I do not believe in divergence for the sake of divergence, but equally I do not believe in convergence or harmonisation for their own sake. We used to have these debates in the European Parliament all the time. Those expressing concerns about EU data adequacy fall into two groups: those who really still wish we were members of the EU, and those for whom that is irrelevant and for whom this really is about the privacy and security of our users. If the EU is raising these issues in its agreements, we can thank it for doing that.
I was obviously involved in the debates on the safe harbour and the privacy shield. As noble Lords have said, we thought we had the right answer; the Commission thought we had the answer, but those arrangements were challenged in the courts. I expect this will be challenged again. Are we diverging just for the sake of divergence, or is there a good reason to diverge here, particularly when concerns have already been raised about security and privacy?
I end by saying that I look forward to the maiden speech of the noble Lord, Lord de Clifford. I thank noble Lords for listening to me, and I look forward to working with noble Lords across the House on some of the issues I have raised.
My Lords, the Bill may contain some good elements in the search for a modernisation of data protection, but in overall terms it seems to tilt the balance of advantage to businesses and government authorities rather than to the individual. It has been marred in its passage by the profusion of late government amendments in the other place on Report, and an absence of scrutiny from the Joint Committee on Human Rights.
There are a number of issues that I think need to be seriously reconsidered. I will focus today on four. I also commend the passion of the noble Baroness, Lady Kidron, on the issues that she raised, some of which I will also touch on.
First, as my noble friend Lord Knight of Weymouth and the noble Lord, Lord Allan of Hallam, said—I do love the noble Lord’s name; alliterative Peers are a wonderful thing—a number of proposals appear to put at risk the free flow of data from the UK to the EU. That has already been touched on. It could even undermine the UK’s data adequacy decision. There seems to be a disconnect between the view that the EU Commission and the EU Parliament have begun to enunciate—that the new powers of the Secretary of State to meddle with the objective and impartial functioning of the new Information Commission could result in the withdrawal of the UK adequacy decision—and the assurances that Ministers have given so far in the other place. Losing that decision, or even seeming to have that decision at risk, would be pretty disastrous for UK business, our trade and our research collaborations. Can the Minister tell the House how he intends to avoid this in the review due next year? How does he square the concerns of the EU with the assurances given by his ministerial colleagues?
My second point is about the new measures introduced at the last minute in the other place—Clause 128 and Schedule 11—requiring the banks to monitor continuously all accounts to find welfare recipients and snitch on them if they meet certain as yet unprescribed criteria. This is not just an abstruse issue; it involves a considerable number of people. Knowing the age of the average Peer, it probably involves pretty well everybody in this House, because, of course, it includes pension recipients, so this is of personal concern to all of us. This is legitimising mass surveillance by algorithm. It seems to me a major intrusion into the privacy of pretty well all individuals in the UK and, to some extent, an infringement of the confidential relationship that you ought to be able to expect between a bank and its customer.
Can the Minister tell the House why he thinks this Big Brother mechanism is necessary? Why can the problem of benefit fraud not be dealt with in a way that does not mean that all customers are subject to surveillance? What alternatives were considered by Government and rejected? What safeguards will go alongside this provision to prevent it from being typified as a heavy-handed Big Brother approach?
It is strange that pension claimants are included. A pension, in my view, is a right, not a benefit; it was paid for by hard work during one’s working life. The Minister said in another place that they intend to extend this sort of surveillance process to other data areas. Can the Minister tell us what other areas and when that extension might take place?
The third issue is AI safety, which has already been raised by a number of noble Lords. The Government were quite bushy-tailed about their recent AI Safety Summit and the commitment to see the UK as a world leader. I am afraid that every time I hear this phrase “a world leader” I have the urge to throw up in my handbag, so you will pardon me if I wrinkle my nose at that. The ambition to be somewhere in the front pack on AI safety and responsible and safe AI innovation is fine, but the Bill is a missed opportunity. I agree with my noble friend Lord Knight of Weymouth that the Bill should be the place where the oversight challenges posed by a very fast-moving set of AI developments, such as biometric technologies, are gripped.
I was a victim of a biometric technology development when I was chancellor of Cranfield University. It developed a process for detecting the microscopic, invisible beads of sweat that form above your eyebrows when you are put under pressure, intended for use in airport security and various other settings. They decided to put me under pressure by making me stand in the main square of the university and answer mental arithmetic questions over a loudspeaker. What they had not quite grasped is that I know I am rubbish at mental arithmetic, so it put me under no pressure whatever, because this was not going to be news to anybody. It therefore failed to detect any microscopic sweat. I thought noble Lords might like the day to be lightened by a humorous account in this pre-Christmas period.
The Bill is a real missed opportunity to grasp those AI developments and the safeguarding that needs to go with them. In fact, you could say that it further erodes the already inadequate legal safeguards that should protect individuals from discrimination or disadvantage by AI systems making automated decisions. We have heard about job hiring and loan applications; this is “The computer says no”, but on speed. We in your Lordships’ House deplore late additions to Bills, although we have rather grown used to them in recent months, but if the summit’s assurances are not to seem a bit hollow, it would be good to hear whether the Minister intends to introduce additional measures on AI safety in the Bill and, if not, in what other legislation and to what timescale.
The fourth issue I want to raise is the role of the Information Commissioner’s Office, soon to be the Information Commission. I entirely approve of the structure of an information commission as opposed to a commissioner. We need a powerful and effective regulator. The ICO’s enforcement and prosecution record has not been sparkling, with low levels of enforcement notices, prosecutions and fines. If, when I was at the Environment Agency, I had had as low a level of those as the Information Commissioner has had, I would think I had gone to sleep somewhere along the line. Does the Minister acknowledge that improvements need to be made to the Bill to ensure: that the new Information Commission has a clear statutory objective and is clearly independent and at arm’s length from government, not the sort of arm’s length that becomes very short in times of crisis; that its regulatory function can be effectively scrutinised at a judicial level; that it retains the biometrics and surveillance camera commissioner functions rather than simply wiping them from the script; and that it is able to consider class action complaints brought by civil society organisations or trade unions?
In my experience, all too often, Governments plural, not just the current Government, establish watchdogs, then act surprised when they bark, and go and buy a muzzle. If the public are to have trust in our digital economy, we need a robust independent watchdog with teeth that government responds to. The Bill will need a lot of work, and there are hours and hours of happy fun in front of us. I look forward to the Minister’s response to my questions and to those of other noble Lords. I also look forward to the maiden speech of the noble Lord, Lord de Clifford.
It is two months since I took my oath in this esteemed Chamber, and every day since I have been grateful to your Lordships for the unique opportunity that has been granted to me. Since that first day, I have been asked on many occasions by friends and colleagues, “How is it going?” My reply: “It is like being back at senior school”. I feel very junior, but that is a nice thing, and I feel quite young too.
Being a new Peer, at times I look around and feel overwhelmed by the wealth of knowledge and depth of experience that your Lordships express in the Chamber and outside. I have been made to feel most welcome and supported, especially today in this debate with your kind words of support, but also by the doorkeepers, who have immense knowledge of the workings of the House and its history and who keep me on the right side of its traditions and customs.
I would also like to mention the staff of the Convenor of the Cross Benches’ office, who have encouraged and guided me to this point, and the many other staff in the Palace who have made me feel so much part of this grand establishment. Finally, if you will indulge me, I thank my wife and family, who are here today to support me.
Whenever you start a new opportunity, you always question where you can contribute. For me, it was today’s debate on data protection. It would appear that I do not have in-depth knowledge of this extraordinarily complex subject—but on reflection I do, given my experience of small business over the past 30 years. I started with farming businesses, where I was part of the accountancy team, and then I ran the business side of a small firm of rural chartered surveyors. For the past 15 years I have managed a large independent veterinary practice which provides care and services to pets, horses and a large range of farming businesses. I know how important it is that we understand and care for the data we hold on behalf of our customers and clients.
It is five years since the original GDPR legislation was introduced. At that time, it caused a significant amount of anxiety within the small business and veterinary world. This was reflected in the number of individuals and businesses attending seminars on the GDPR put on by the Veterinary Practice Management Association, an organisation of which I am proud to be the current president. It promotes management and leadership, which are also a passion of mine, in the veterinary sector. This revision of the legislation is extremely well timed and needed. SME businesses are comfortable with the processes they have in place today to comply with the current legislation but, in a fast-moving and changing IT world, the simplification and clarification that the Bill intends to bring to the rules on the legitimate use of data are welcome.
Nearly all small businesses, from sole traders to large owner-managed companies, are data controllers. All collect personal data of some form in sales databases, client and patient relationship software and accountancy packages. Keeping control of this data is becoming harder for businesses, as it has never been easier to export substantial amounts of data from these systems for many different purposes. There is therefore an increased risk that personal data can be lost or stolen through the ever-increasing threat of cyberattack. It is essential that this updated legislation takes into account where all data is stored, and its many different formats, and ensures that it is not unknowingly shared with other users.
As my research for this debate has shown me, this Bill is immensely complex, which I know is required—but I fear that its complexity will mean that it is not fully complied with by a number of small to medium-sized businesses that do not have the resources or time to research and instigate any changes that may be required. Investment will therefore be needed from government to publicise the changes in a simple and understandable way to SMEs. If the Minister could say how he intends to communicate these changes to the sector, that would be welcome.
With regard to the section on smart data, the changes made by the banking sector have already brought immense efficiencies and security for small businesses. Extending smart data further would bring more efficiencies for the business community. A cautious approach is needed when extending its use, to ensure that businesses sharing and receiving personal data are compliant with these complex regulations and that open application programming interfaces cannot be infiltrated or hacked.
Personal data has without doubt grown significantly in value in the five years since the introduction of the original data protection legislation. The exchange of data between businesses, scientific institutions and government will only improve efficiency, productivity and scientific breakthroughs, which is one of the goals of this legislation. Protecting data and recognising its value are essential as we review the Bill. As it currently stands, the Bill could potentially favour large IT corporations, whose ability to collect, process and monetise data is well known, so we must ensure that the new, up-to-date regulations do not require large amounts of resources to implement, giving a level playing field on which all businesses can benefit from the power of data analysis. I agree with the noble Lord, Lord Allan of Hallam, on the need to access EU data so that small businesses can continue to trade without too much hassle and burden. I look forward to learning more of the ways of the House as I continue to contribute to this Bill as it moves to Committee.
My Lords, it is a great pleasure to follow my noble friend Lord de Clifford and to congratulate him on an excellent and insightful maiden speech. I am pleased that he has chosen this important Bill for this occasion. Data protection is something of a minority sport and it is great to add another person to the select group in this Chamber.
Data protection is about finding the right balance between protecting individuals’ privacy and the bureaucracy and costs that such protection imposes on small businesses and others. My noble friend’s long experience in managing small and medium-sized businesses gives him great insight into how these regulations will impact the businesses that typically find it most difficult to deal with greater bureaucracy, as he so rightly pointed out. SMEs are often overlooked more generally, so having such an experienced voice to remind us of their importance during our deliberations will be a great asset to the House, and from a personal point of view it is a great pleasure to welcome a fellow finance professional to join us.
The noble Lord’s experience in the veterinary sector should also be of enormous value to the House. I hope that my noble friend Lord Trees will not mind having his monopolistic position in the field broken. It seems that the noble Lord has also been hiding another light under a bushel: I believe that he has also competed for Great Britain in equestrianism, so he is clearly a man of many talents. I tried to find a joke to do with horsing around, but I am afraid that inspiration completely deserted me. I—and, I am sure, all noble Lords—look forward to his future contributions, both on this Bill and more widely.
I turn now to the specifics of the Bill. As I mentioned, data protection is about finding the right balance between individual privacy and the costs, processes and rules that must be in place, alongside the ability to carry out essential criminal investigations and to protect national security. I think it is generally agreed that the GDPR has its flaws, so an effort to look again at that balance is welcome. There is much in the Bill to like. However, there are a number of areas where the Bill may move the balance too far away from individual privacy, as a number of other noble Lords have already mentioned. In fact, there is not much that I have disagreed with in the speeches so far.
It is a long and very complex Bill; the fact that the excellent Library briefing alone runs to 70 pages says a lot. It will not be possible to raise all issues; noble Lords are probably grateful for that. I am going to concentrate on four areas where I can see significant risks, but the Minister should not take that as meaning that I disagree with other things that have been said so far; I agree with almost everything that has been raised.
First, a general concern raised a number of times, in particular by the noble Lord, Lord Allan, is that the Bill moves us significantly away from our existing data protection rules, which were based clearly on the EU regulations. We are currently benefiting from an EU data adequacy ruling which allows data to be transferred freely between the EU and the UK. This was a major concern at the time of the Brexit discussions; at that time, data adequacy was not a given. The ruling comes to an end in June 2025, but it can be ended sooner if the EU considers that our data protection rules have diverged too far.
The impact assessment for the Bill—another inch-thick document—says:
“Cross-border data transfers are a key facilitator of international trade, particularly for digitised services. Transfers underpin business transactions and financial flows. They also help streamline supply chain management and allow business to scale and trade globally”.
It is good that the impact assessment recognises that. The loss of data adequacy would therefore have significant negative impacts on trade and on the costs of doing business. Without it, alternative and more costly methods of transferring data would be required, such as standard contractual clauses. There are also implications for investment, as the noble Lord, Lord Allan, pointed out. Large international financial services organisations would be much less likely to establish data processing activities in the UK if we were to lose data adequacy. Indeed, they may decide that it is worth moving their facilities away from here.
The impact assessment suggests surprisingly low costs might arise: one-off costs of £190 million to £460 million, and annual lost trade of £210 million to £420 million. However, these figures cover only the direct reduction in trade with the EU; as the impact assessment points out, the impacts will likely be larger once interactions with onward supply chains are taken into account.
The impact assessment does not judge the probability of losing the data adequacy status. I find that rather extraordinary, possibly even shocking, as it is so important. The New Economics Foundation and UCL conservatively estimate the cost of losing data adequacy at £1 billion to £1.6 billion; however you look at it, these are very large numbers.
What can the Minister tell us that could set our minds at rest? What discussions have taken place with the EU? What initial indications have been received? What changes have been made to the original draft Bill to take account of concerns raised by the EU around data adequacy? What is the Government’s assessment of this risk? The Bill has been on the blocks for a long time now. I have to assume that a responsible Government must have had discussions with the EU around data adequacy in relation to these proposals.
Secondly, as we have heard, Clause 129 would enable Ofcom to require social media companies to retain information in connection with an investigation by a coroner into the death of a child, where the child was suspected to have died by suicide. This is a welcome addition but, as we have heard, it does not go far enough. It does not include all situations where a death was potentially related to online activity; for example, online grooming. My noble friend Lady Kidron has, as always, covered this with much greater eloquence than I could. I suspect the Minister already knows that the Government have got this wrong. As the noble Lord, Lord Knight, pointed out, it would be a brave Minister who tried to hold the current line in the face of opposition from my noble friend. I welcome the words that the Minister said at the beginning of this debate—that he is willing to engage on this matter. I hope that engagement will be constructive.
Thirdly, the Bill introduces draconian rules that would enable the DWP to access welfare recipients’ personal data by requiring banks and building societies to conduct mass monitoring without any reasonable grounds for suspecting fraudulent activity. As the noble Baroness, Lady Young, pointed out, this includes anyone receiving any kind of benefit, including low-risk benefits such as state pensions, so, as she has pointed out, most noble Lords will be subject to this potential intrusion into their privacy—although, fortunately, not me yet. The Government argue that this power is required to reduce levels of benefit fraud. My enthusiasm to tackle fraud is well known, but the Government already have powers to require information where they have grounds to suspect fraudulent behaviour. This new power, effectively enabling them to trawl any bank account with no grounds at all, is a step too far, and constitutes a worrying level of creep towards a surveillance society.
That brings me neatly on to my fourth concern, which the noble Lord, Lord Kamall, raised earlier. The Bill will abolish the post of Biometrics and Surveillance Camera Commissioner—currently both roles are held by one person—as well as the surveillance camera code. It was interesting that the Minister did not mention this in his opening speech. It is extremely important.
The Government argue that these functions are covered elsewhere or would be moved elsewhere—for example, to the ICO—but that does not seem to be the case. An independent report by the Centre for Research into Information, Surveillance and Privacy, commissioned by the outgoing commissioner, sets out a whole range of areas in which there will be serious gaps in the oversight of handling biometric data and, in particular, the use of surveillance cameras, including facial recognition.
The independent report concludes that none of the Government’s arguments that the functions are adequately covered elsewhere “bear robust scrutiny”. It notes in particular that the claim that the Information Commissioner’s Office will unproblematically take on many BSCC functions mistakes surveillance as a purely data protection matter and thereby limits
“recognition of potential surveillance-related harms”.
Given the ever-widening use of surveillance in this country, including live and retrospective facial recognition, and the myriad other methods of non-facial recognition being developed, such as gait recognition or, as I was reading about this morning, laser-based cardiac recognition—it can read your heartbeat through your clothing—alongside the ability to process and retain ever greater amounts of data and the emerging technology of AI, having clear rules on and oversight of biometrics and surveillance is more important than ever. We see how the misuse of surveillance can go—just look at China. Imagine, for example, if this technology, unfettered, had been available when homosexuality was illegal. Why do the Government want to remove the existing safeguards? With the advances in technology, surely these are more important than ever. We should be strengthening safeguards, not removing them.
The outgoing commissioner—if the Government get their way, the last surveillance camera commissioner—Professor Sampson, put it best:
“There is no question that AI-driven biometric surveillance can be intrusive, and that the line between what is private and public surveillance is becoming increasingly blurred. The technology is among us already and the speed of change is dizzying with powerful capabilities evolving and combining in novel and challenging ways … The planned loss of the surveillance camera code is a good example of what will be lost if nothing is done. It is the only legal instrument we have in this country that specifically governs public space surveillance. It is widely respected by the police, local authorities and the surveillance industry in general. It’s one of those things that would have to be invented if it didn’t already exist, so it seems absolutely senseless to destroy it now, junking the years of hard work it took to get it established”.
These are just four of the areas of concern in the Bill. There are many more, as we have heard. In the other place, following the failure of the recommittal Motion after all the new amendments were dropped in at the last minute, David Davis MP said that the Commons had
“in effect delegated large parts of the work on this important Bill to the House of Lords”.—[Official Report, Commons, 29/11/23; col. 888.]
That is our job, and I believe that we do it well. I hope the Minister will engage constructively with the very genuine concerns that have been raised. We must get this Bill right. If we do not, we risk substantial damage to the economy, businesses, individuals’ privacy rights—especially children—and even, as far as the surveillance elements go, to our status as a free and open democratic society.
My Lords, I have now reached the grand old age of 71, and it is a worrying fact that I think this puts me bang on the average age of those in your Lordships’ House. So, it is a huge relief to be able to welcome to this House the two Peers, such young Peers, who have preceded me. I echo what the noble Lord, Lord Vaux, said, and I find myself in agreement with him, in that I have agreed with most of what has been said in this debate so far. I also echo his welcome to the noble Lord, Lord de Clifford, who brings real front-line experience of the effects of what we do in this House on small and medium-sized enterprises. He is someone that I know noble Lords will want to hear from in the years to come—and in view of his age, we can look forward to very many of them.
I declare my interest as chairman of the advisory panel of Thales, a digital company, and a member of the Post Office Horizon Compensation Advisory Board. I have learned in relation to the Post Office scandal that the complexity of computers is such that nobody really fully understands exactly what programs will do, so it is absurd that there is still in law a presumption that computers will operate as they are intended to. I hope that noble Lords will be able to turn their minds to changing that in the relatively near future.
I can be brief, because I was intending to raise issues relating to privacy, cookies and information which have already been so well canvassed by my noble friend Lord Kamall. Currently, we have to consent to cookies and terms and conditions, but we do not read them, we do not understand them and we do not know their effect—we do not have time. We will do anything for convenience, so the consent that we give is neither informed nor freely given. My noble friend Lord Kamall said what I wanted to say about the open electoral register. The thought of sending paper letters to everyone to inform them about the use of their data seems disproportionate, and I, too, would like to know what on earth the ICO is thinking of in demanding such notification to everybody in the Experian case. I also adopt his questions about exemptions from getting consent to cookies when they are purely functional and non-intrusive. But there is no need for me to say it again, so I will not.
My Lords, on behalf of these Benches, I too welcome the noble Lord, Lord de Clifford. I pay tribute to his maiden speech and thank him for his insightful and valuable contribution to this debate. I also look forward to many future occasions on which he will contribute to the work of this House.
As the right reverend Prelate the Bishop of St Albans has said, we on these Benches recognise that high-quality data is crucial to creating and sustaining a healthy and efficient society. However, it is vital to get the balance right between ownership, access, control and legitimate use of that data. Human flourishing should be at the forefront of regulating how data is used and reused. As we said in our written response to the Government’s 2020 data consultation:
“Fundamentally, the church welcomes any technology that augments human dignity and worth, while staunchly resisting any application of data that undermines that dignity. Questions of efficiency and cost-effectiveness are subsidiary to questions about how the types and uses of data will promote human flourishing in society and best practice in public bodies”.
It seems that the real test of this legislation is how it will truly promote good democracy and the extent to which it will protect the safety and enhance the security of the most vulnerable in our society. I hope the House will permit me a brief seasonal reference in pointing out that it was, in fact, a comprehensive data collection exercise by Quirinius, motivated entirely by greed and an abuse of power, that first resulted in the Holy Family travelling to Bethlehem. It also meant that they would need to flee very quickly indeed when the Christ child’s identity and location came to the attention of an insecure leader with unregulated power who also had exclusive access to the data, albeit in a very ancient form.
We acknowledge that current provision for data regulation is also outdated and in urgent need of reform. We support the Government’s intention to reform the Information Commissioner’s Office while preserving its independent footing, and the introduction of an information commission. But it is interesting to compare the Bill before us today with the concerns we expressed in 2020. First, our goal then was
“to flag some of the more significant risks we foresee in using data without adequate reflection on the pitfalls and harms that hasty and ill-considered data use gives rise to”.
It is sobering, therefore, that the Bill arrives in this House substantially amended in ways the other place has had insufficient time to scrutinise. The Online Safety Act perhaps offers a valuable and recent template for how this House might examine and improve this important Bill.
Secondly, we said we acknowledged the benefits of data but also the importance of gaining and retaining public trust. Therefore, it is worrying that, with some of the measures in the Bill, the Government seem to be reducing the levers and mechanisms that public trust depends upon. The Public Law Project’s assessment is that:
“While the Bill does not outright remove any of the current protections in data protection law, it weakens many of them to the extent that they will struggle to achieve their original purposes”.
We share the concerns of many civil society groups that the Bill will reduce transparency by weakening the scope of subject access requests, although I welcome the concern to mitigate plainly vexatious complaints. In June, the chief executive of the Data Protection Officer Centre said:
“Whilst countries across the globe are implementing ever-more robust data protection legislation, the UK seems intent on going in the opposite direction and lowering standards”.
What reassurance can the Minister give the House that the Bill will retain public trust and will not diverge even from current adequacy agreements?
Thirdly, we emphasised the Nolan principles as an aid to the public use of data. On 6 December 2023, the Public Accounts Committee in the other place published a report that noted that the DWP is piloting the use of machine-learning algorithms to identify potentially fraudulent claims. We are all in favour of proportional and effective measures to counter fraud, but Big Brother Watch argues that it is
“wholly inappropriate for the UK Government to order private banks, building societies and other financial services to conduct mass, algorithmic, suspicionless surveillance and reporting of their account holders on behalf of the state”.
Will the Minister explain how the state demanding data without cause—including, as a number of Members pointed out, data on the bank accounts of recipients of the state pension that it itself says it has no intention of using—complies with the Nolan principles of openness and accountability? Is this not at risk of being an overreach of government into people’s private lives?
His Majesty’s Government made commitments at the recent AI Safety Summit to make the UK a world leader in safe and responsible AI innovation, so would we not expect that the Data Protection and Digital Information Bill would provide oversight of biometric technologies and general purpose artificial intelligence? My colleague the right reverend Prelate the Bishop of Oxford regrets that he is unable to participate in the debate today, but he will again lead for us as we scrutinise the Bill more thoroughly, including its gaps in protecting children’s data and in the regulation of data use by AI foundation or frontier models.
Regarding the latter, an important gap in the interlocking of regulation persists. As the BBC reported over the weekend, assurances given in this House during the passage of the Online Safety Act are being threatened. The draft amendment grants access to data only where children have taken their own lives. This is not what the Government promised on the record in either the Commons or the Lords, and we will continue to press for a proper resolution. Surely we cannot simply rely on other holders of important data to disclose information that is needed to protect children’s well-being.
I will comment briefly on death registration. The ability to move from a paper to an electronic register is to be commended. However, the UK Commission on Bereavement, chaired by my colleague the right reverend Prelate the Bishop of London, has recommended that more could be done to reduce the administrative burden on bereaved people. The Tell Us Once system is designed so that someone reporting a death need do so only once, with the information then shared with the relevant public services. Currently, however, bereaved people must still notify private companies of a death separately. Can the Government please review the system to see whether this burden could be lessened? I would be grateful if the Minister could clarify how the extended priority services register announced in the Autumn Statement will work alongside Tell Us Once. In addition, do the Government have any plans to undertake an updated equality impact assessment of Tell Us Once, given that the last one was 12 years ago?
We look forward to working with everyone in this House to carefully understand and, where appropriate, strengthen an important Bill for the future flourishing of the country and the well-being of all.
My Lords, I very much welcome the maiden speech of the noble Lord, Lord de Clifford. As one who entered this House in his early 50s, I can recommend that coming in here, just as the mid-life crisis starts to bite, and being, as I was then, Young Tom again, is a great boost to the morale.
I associate myself with the advice given by the right reverend Prelate the Bishop of Southwell and Nottingham. At the end of the recent passage of the Online Safety Bill, there was general thanks to the noble Lord, Lord Parkinson of Whitley Bay, the Minister who guided the Bill safely through the Lords, for his willingness to listen to argument and to amend where necessary. I fear that the noble Viscount will hit some choppy water in this House unless he adopts a similar attitude, and he should certainly take the noble Baroness, Lady Kidron, very seriously concerning children’s data rights.
The Government’s declared intention of reducing burdens on organisations while maintaining high data protection standards has met with scepticism and outright criticism from a wide range of industry bodies, civil society organisations and individuals with expertise in this area. As has been said, the Official Opposition in the other place asked that the Bill be recommitted to a Public Bill Committee for further scrutiny, but this was refused. As the noble Baroness, Lady Young, indicated, this has put further onus on this House to make sure there is time to listen to and examine the wide range of criticisms and amendments seeking to improve the Bill.
In 2010, I became Minister of State at the Ministry of Justice. Among my responsibilities were the ICO and the early negotiations on what became the GDPR. One of my first roles was to go to a facility south of the river to look at our skills in this area. After looking at a number of things, I asked the government official who was showing me the facility whether there were any human rights or privacy issues involved. He said, “Oh no, sir. Tesco knows more about you than we do”. There is a certain profligacy by individuals with their data, along with real concern about their privacy. It is riding those two horses at once that is going to be the challenge of this Bill. I approach the Bill with an eye to ensuring, like the noble Baroness, Lady Young, that the ICO is well served by this legislation and continues to set standards and protect individuals.
Prior to Brexit, I was on one of your Lordships’ sub-committees, where we constantly pressed Ministers about data adequacy with the EU on our departure. The answers then were very much along the lines of, “Well, it’ll be all right on the night”. I hope that the Minister will again reassure us in his wind-up that the data protection legislation in the Bill clarifies the law without deviating from the principles set out in the GDPR. The UK’s data adequacy status, granted by the European Commission, is important, and we do not want to see it jeopardised in pursuit of some mythical benefits from Brexit.
I am sorry that my noble friend Lord Allan will not be joining us for the rest of this Bill; I would have valued his contribution, but I will keep an eye on the issues he raised, as a number of other colleagues have indicated they will too.
More widely, one of the problems with this Bill is that its scale, and how it has been dealt with by the Government in its preparation, its false starts and its handling in the other place, mean that we are going to legislate for myriad issues, each of which is of importance to the sector, the individuals concerned or society, and each of which will require our full due care and attention. For example, the new powers in Clauses 87 and 88, which allow the Secretary of State to offer an exemption from the direct marketing provisions for the purpose of democratic engagement, may invite abuse. I put that mildly. This morning’s FT contains an article raising precisely these fears, and this issue must be examined in detail during the passage of the Bill.
One issue that I was going to deal with in detail was referred to by the noble Lord, Lord Kamall. The Minister might, even at this early stage in the Bill’s progress, provide clarification about the use of the open electoral register for direct marketing purposes. This issue has also been raised with me by the Data & Marketing Association. As the noble Lord, Lord Kamall, explained, there are big concerns in the market about what companies can do with personal data from the open electoral register and this needs to be resolved.
Unfortunately, considerable market uncertainty has been caused by the ICO’s enforcement notice, which has already been referred to. In the light of all this legal and market uncertainty, and given that this Bill is before the House, the best and most timely option is to address the issue in the Bill, and I urge the Government to consider what can be done here. Perhaps the noble Lord, Lord Kamall, and other noble Lords could discuss a joint amendment.
That is just one example of the issues in the Bill that will require detailed examination and close attention. Much of this work will be practical and will involve building a framework of law and regulation that keeps pace with the new technologies that are now part of the digital and data revolution. In this, the impact of AI will cast a long shadow over our deliberations, as the noble Lord, Lord Knight, the noble Baronesses, Lady Kidron and Lady Young, and others have made clear.
The right reverend Prelate the Bishop of St Albans referred to the benefits of the wide-ranging briefings that we received prior to today’s debate. Let me assure the authors that none of those briefings will go to waste as we move into Committee. As well as dealing with the mundane and the practical, we have to take seriously the advice contained in one briefing, which read:
“At a time of advancing AI-driven surveillance, and when public concerns over measures such as facial recognition technology are heightened, removing oversight and accountability could have serious implications for public trust in policing”.
This warning could apply to almost any sector, service or industry covered by the Bill. Two quotes leap out to me from the excellent Lords Library briefing on the Bill, which has been referred to. One comes from the Information Commissioner, who calls for a regulator that is “trusted, fair and independent”, and the other comes from techUK, which calls for a Bill that will
“help spur competition and innovation in the market, whilst empowering consumers and delivering better outcomes”.
Riding those two horses at once is now the task before us.
My Lords, I join others in welcoming the noble Lord, Lord de Clifford, to this House. I look forward to hearing him in future debates.
This is a large Bill, written in utterly arcane language that normal people will struggle to understand and follow. I hope that the Government will try to write Bills in a better way; otherwise, it is hard for people to understand the law and comply with it. I have grave misgivings about some parts of this Bill, and I will touch on a couple of those issues, which have already been identified by a number of noble Lords.
George Orwell’s iconic novel Nineteen Eighty-Four, published in 1949, raised the spectre of Big Brother. That nightmare has now been brought to reality by a Conservative Government supposedly rolling back the state. The Government have already undermined the people’s right to protest and to withdraw labour. Now comes snooping and 24/7 surveillance of the bank, building society and other accounts of the sick, disabled, poor, elderly and unfortunate, all without a court order. Over 22.4 million people would be targeted by that surveillance, but the account holders will not be told anything about the frequency and depth of this organised snooping.
In true Orwellian doublespeak, the Government claim that the Bill will
“allow the country to realise new post-Brexit freedoms”.
They link the surveillance to people’s fears about benefit fraud, fears which they are themselves stirring up, while there is absolutely no surveillance of those receiving public subsidies, those mis-selling financial products, those accused of PPE fraud or even a former Chancellor who abused the tax system. Numerous court judgments have condemned the big accounting firms for selling illegal tax-dodge schemes and robbing the public purse, but despite those judgments no major accounting firm has, under this Government, ever been investigated, fined or prosecuted. None of the accounts of those partners or firms is under surveillance. The Bill is part of a class war: it targets only low-income and middle-income people, while the big beasts get government contracts.
Currently, the Department for Work and Pensions can request details of bank accounts and transactions on a case-by-case basis on suspicion of fraudulent activity, but Clause 128 and Schedule 11 give the Government unrestrained powers to snoop. The Government say that the Bill
“would allow regular checks to be carried out on the bank accounts held by benefit claimants to spot increases in their savings which push them over the benefit eligibility threshold, or when people spend more time overseas than the benefit rules allow for. This will help identify fraud”
and
“take action more quickly”.
How prevalent is the benefit fraud that the Government wish to tackle? The Government estimate that, in 2023, they lost £8.3 billion to welfare fraud and errors, 80% of which is attributed to fraud. A government statement issued on 23 November said that, as a result of mass surveillance, the crackdown on benefit fraud would save the public purse
“£600 million over the next five years”.
On 29 November, in a debate in the other place, the Minister mentioned the figure of £500 million and, despite a number of challenges, did not correct that estimate. The Government are hoping that mass snooping will generate savings of £100 million to £120 million a year, presumably the £500 million to £600 million figures spread evenly over five years, but we do not have a breakdown of this saving and do not know how they arrived at that number. I hope that the number is more reliable than the Government’s estimates of the HS2 costs. To put this into context, the Government are spending nearly £1,200 billion this year, and they are introducing snooping to save about £100 million a year, which is less than 0.01% of that total.
The snooping on bank accounts suggests that the Government are looking for unusual cash-flow patterns. What that means is that, if anyone gives a lump sum to a loved one for Christmas, a birthday, a holiday or home repairs, and it passes through their bank account, the Government could seize on that as evidence of excess resources and reduce or stop their benefits. Suppose that a poor person pawns some household items for a few pounds and temporarily boosts his or her bank balance. Would that person now be labelled a fraudster and lose benefits? The Government have not looked at the details of what would happen.
Many retirees have a joint bank account with another member of the family or with a friend. Under the Government’s crazy plans, the third party would also be put under surveillance because they happen to have a joint account. Can the Minister explain why people who do not receive any social security benefits are to be snooped on simply because they are caught in this trap?
How will the snoopers distinguish temporary and easily explainable boosts in bank balances from others? My background is that I am an accountant and I have investigated things over the years; I helped the Work and Pensions Committee investigate the collapses of BHS and Carillion. So I hope that the Minister can enlighten me on how all this will be done.
I hope that the Minister can also clarify the scope of the Bill as it applies to recipients of the state pension. The Government have classified the state pension as a benefit; can the Minister explain why? After all, the amount one gets is determined by the number of years of national insurance contributions one has made, so why is it a benefit at all? The Minister in the other place said:
“I agree, to the extent that levels of fraud in state pensions being currently nearly zero, the power is not needed in that case. However, the Government wish to retain an option should the position change in the future”.—[Official Report, Commons, 29/11/23; col. 912.]
Why do the Government want to snoop on the bank accounts of OAPs when there is hardly any fraud? Do they have some sinister plan to treat the state pension as a means-tested benefit? Perhaps the Minister could confirm or deny that. If he wishes to deny it, can he explain why the Government are targeting retirees? What have they done?
In this House, we have more than our fair share of senior citizens who receive a state pension, and their bank accounts would also be under surveillance. How long before a Government abuse that information to blackmail Members of this House and erode possibilities of scrutinising the Government of the day? It is opening us all up to blackmail, now or in the future.
In the past, the Government assured us that health data would not be sold—but then sold it to corporations, as we heard earlier. How can we trust the Government not to do the same with data collected via snooping on bank accounts? What will they be selling?
The mass surveillance is not subject to any court order. Concerned citizens will not be told, as their right to know will be further eroded by Clause 9. It is for the courts, not Ministers, to decide whether requests for data are vexatious or excessive. Can the Minister provide us with some data on how many requests for information are received by departments each year and what proportion have been declared to be vexatious and excessive by the courts? The Government cannot just say that they are vexatious—I would rather trust the courts.
Clause 9 obstructs government accountability and further erodes the Nolan principles. As a personal example, I fought a five-and-a-half-year battle against the Treasury to learn about the closure of the Bank of Credit and Commerce International in 1991. It was the biggest banking fraud of the 20th century, and it has yet to be investigated. I asked the Treasury for some information and was totally fobbed off. I went to the Information Commissioner, who sided with the Treasury. So I went to the courts to get some information, with the possibility that the judges might declare my attempts to learn the truth vexatious and might even impose legal costs on me. Fortunately, that did not happen—I won the case and the Treasury had to release some documents to me.
The information showed that the Conservative Government were covering up money laundering, frauds, the secret funding of al-Qaeda, Saudi intelligence, arms smugglers, murderers and others. The information given to me has never been put on the public record by this Government. Can you imagine what will happen now if requests to learn something about banking fraud are simply labelled vexatious and excessive? How will we hold the Government to account? The Bill makes it harder to shine some light on the secret state, and I urge the Government to rethink Clause 9.
Finally, I urge the Minister to answer the questions I have raised, so that we can have a better Bill.
My Lords, it is a pleasure to follow the noble Lord, Lord Sikka. I very much share his concerns about the Government prying into the bank accounts of benefit recipients and pensioners. This is a historic moment, for all the wrong reasons, with the Government looking to pry into the private lives of millions of people, with no evidence that it is in any way necessary. The biggest problem with benefits, of course, is the large amount of money that is left unclaimed, or unpaid due to errors made by the Department for Work and Pensions.
I will also pick up the noble Lord’s point about economic crime. I note that this happens to be the week that, in a Frankfurt court, the former global head of tax at Freshfields Bruckhaus Deringer acknowledged in his testimony that he had
“glossed over the fact that my legal advice was used for illegal means”.
This was a man who, until 2019, was earning €1.9 million a year.
I have a direct question for the Minister. The Government have talked a great deal about the DWP and their plans in that area. What does the Bill do to tackle economic crime, given that the head of UK Finance described the UK as
“the fraud capital of the world”
and that we have an enormous problem with enablers, down the road in the City of London, who we know are getting around sanctions from the UK Government and others, swishing so much dirty money through London that it is now known as the “London Laundromat”? What does the Bill do on these issues?
I will tick off some points of agreement and concern from previous speeches. The Minister spoke of
“the highest standards of data protection”.
From what I recollect of the Minister’s speech, there was a surprising lack of the Government’s favourite word, “world-leading”. What does it mean if these data protections are not world-leading?
The Minister also said the Bill was “codesigned all the way”. A number of noble Lords have pointed to the 260 amendments on Report in the other place; that really does not look like a codesigning process. One benefit of working across many Bills is that patterns become visible: this Bill reminds me—and not in a good way—of the Procurement Bill, where your Lordships’ House saw a similar deluge of government amendments and had to try to disentangle the mess. I fear that we are in the same position with this Bill.
I pick up the speech of the noble Baroness, Lady Kidron—spectacularly excellent, as always—and her points about edtech and the situation with technology in education systems, and the utter impossibility of teachers, nursery nurses or people in similar positions dredging through the fine detail of every app they might want to use to ensure that their charges are protected. That is obviously not a viable situation. There have to be strong, protective general standards, particularly for apps aimed at children. The Government have to be able to guarantee that those nursery nurses and teachers can just pick something up—“It’s approved, it’s okay”—and use it.
I will also pick up the points that the noble Baroness, Lady Kidron, made about the importance of data being available to be used for the public good. She referred to research, but I would like—and I invite NGOs that are interested—to think about community uses. I was recently with the National Association of Local Councils, of which I declare that I am a vice-president, in Shropshire, where we saw parish and town councils doing amazing work to institute climate action. I am talking about small villages where data protection is not really an issue, as everyone knows everything about everybody. But we might think of a suburb of Liverpool or a market town, where people do not have the same personal knowledge of each other but where a council or community group could access data for good reasons. How can we make it possible to use these tools for positive purposes?
Briefly picking up on the points made by the noble Lord, Lord Allan—another of our experts—I echo his stress on the importance of EU equivalency. We have dumped our small businesses, in particular, in the economic mire again and again through the whole process of Brexit. There is a reason why #brexitreality trends regularly. We have also dumped many of our citizens and residents in that situation. We really must not do it again in the technology field.
I have a couple of what I believe to be original points. I want to address specifically Clauses 28 and 30, and I acknowledge here a briefing from Rights and Security International. It notes that these clauses enable the Government to grant police forces an opt-out from having to comply with many of the data protection requirements when they are working with the intelligence services. For example, they could grant police immunity for handling personal data unlawfully and reduce people’s right of access to their personal data held by the authorities.
In the Commons, the Minister said these provisions would be “helpful” and “efficient”. I put it to your Lordships’ House that, to interfere with rights such as these with any justification, the Government should at the very least have to show that doing so is “proportionate” and “necessary”. That is an area that I suspect my noble friend Lady Jones of Moulsecoomb will pick up in Committee. There are also issues, raised by the Ada Lovelace Institute and by other noble Lords, about the oversight of biometric technologies, including live facial recognition systems, emotion detection and the foundation models that underlie apps such as ChatGPT. The already limited legal safeguards are being further undermined by the Bill, at a point when there is general acknowledgement in the community that we should be heading in the opposite direction. I think we all acknowledge that this is a fast-moving area, but the Government are already very clearly behind.
There are two more areas that I particularly want to pick up. One is elections, on which attention has only just begun to focus. The Bill would allow the Government to tear up long-standing campaign rules with new exemptions. At present we have safeguards against direct marketing. These are being removed and,
“for the purposes of democratic engagement”,
anyone from 14 years and above can be targeted. I feel like warning the Government: my experience with young people is that the more they see of the Government, the less they like them, so they might want to think about what messages they send them. Seriously, I note that the Information Commissioner’s Office said during the public consultation on the Bill—and we can really hear the bureaucratic speak here—
“This is an area in which there are significant potential risks to people if any future policy is not implemented very carefully”.
The discussion of the Bill has reflected how this could put us in a situation where our elections are even more like those in the United States of America, which, given the place of big money in its politics, is of course no recommendation at all. We really need to link this with the Government’s recent decision to massively increase election spending limits. Put those two things together and I suggest that is a real threat to what limited democracy we have left in this country.
There is a further area which I am not going to go into in great detail, given the hour and the day, but which I will probably come back to in Committee. There is an extensive briefing from Understanding Patient Data, which I am sure many have seen. It really matters that the Bill comes up with a different definition of identifiable data. In the health sector, it is very common to use pseudonymised information from which key identifying details are removed, but it is still quite possible to work backwards and identify an individual from their data because, say, they have an extremely rare disease and live in a particular area of the country.
This new Bill has, instead, more of a subjective test; the definition seems to rely on the judgment of the data controller and what they know. If the Minister has not looked at the briefing from Understanding Patient Data, I really urge him to, because there are concerns here. We already have very grave concerns in our community about the use of medical data, the possible loss of anonymity and the reuse of data for commercial research. We are, again, edging towards an Americanisation of our health system.
I conclude by saying that we have an enormous amount of work to do here in your Lordships’ House; I am trying not to let my head sink quietly on to the Bench in front of me, but we are going to have a break first, of course. I say to all noble Lords and—echoing the comments earlier—the many members of staff who support us by working so hard and often so late: thank you very much and Merry Christmas all.
My Lords, it is a pleasure to take part on Second Reading; I declare my interests in financial services and technology, in Ecospend Ltd and Boston Ltd. There is a fundamental truth at the heart of our deliberations, both on Second Reading and as we progress to Committee: it is our data. There are no great large language models; perhaps it would be more appropriate to call them large data models—maybe then they would be more easily and quickly understood by more people. Ultimately, our data is going into AI for potentially positive and transformational purposes, but only if there is consent, understanding, trustworthiness and a real connection between the purpose to which the AI is being put and those of us whose data is being put into it.
I am going to focus on four areas: one is data adequacy, which has already, understandably, been heavily mentioned; then AI, smart data and digital ID. I can probably compress everything I was going to say on the first subject by simply asking my noble friend the Minister: how will the Bill assure adequacy between the UK and the EU? It is quite a large Bill—as other noble Lords have commented—yet it still has a number of gaps that I am sure we will all be keen to fill in fully when we return in 2024. As already mentioned, AI is nothing without data, so what checks are being put in place for the many provisions throughout the Bill where AI is used to interrogate individuals’ data? Would it not be absolutely appropriate for there to be effective, clear, transparent labelling across all AI uses, in the public and private sectors alike? Saying this almost feels like going off track from the Bill into AI considerations, but it seems impossible to consider the Bill without seeing how it is inextricably linked to AI and the pro-innovation AI White Paper published earlier this year. Does the Minister not agree? How much line-by-line analysis has been done to ensure that there is coherence between the Government’s ambitions for AI and what is currently set out in the Bill?
On smart data, there are clearly extraordinary opportunities, but they are not inevitabilities. To consider just one sector, energy: the ability to deploy customers’ data in real time—through their smart meters, for example—to auto-shift them to the cheapest tariff could be extraordinarily positive. But, again, that is only if there is an understanding of how the consent mechanisms will work and how each citizen is enabled to understand that it is their data. There are potentially huge opportunities, not least to do something significant about the poverty premium, where all too often those who find themselves with the least are forced to pay the most, often for essential services such as energy. What are the Government doing to look at additional sectors for smart data deployment? What areas of state activity, current or previous, are being considered for the deployment of smart data? What stage is that analysis at?
On digital ID, about which I have spoken a lot over previous years, again there are huge opportunities and possibilities. I welcome what is in the Bill around the potential use of digital ID in property transactions. This could be an extraordinarily positive development. What other areas are being looked at for potential digital ID usage? What stage is that analysis at? Also, is what is set out in the Bill coherent with other government work in other departments on digital ID? It seems that a lot has been done and there have been a number of efforts from various Administrations on digital ID, but we are yet to realise the prize it could bring.
I will ask my noble friend some questions in conclusion. First, how will the introduction of the senior responsible individual, the SRI, improve things compared with the data protection officer? Again, how will that impact on issues such as, but not limited to, adequacy? Similarly, linking back to artificial intelligence, a key principle—though not foolproof by any measure and certainly not a silver bullet, but important none the less—is the human in the loop. The Bill is currently some way short of a clear, effective definition and exposition of how meaningful human intervention, involvement and oversight will work where autonomous systems are at play. What are the Government’s plans to address that significant gap in the Bill as currently drafted?
I end where I began, with the simple truth that it is our data. Data has been described in various terms, not least as the new oil, but that definition gets us nowhere. It is so much more profound than that. Ultimately it is part of us and, when it is put together in combination, it gets so close to giving such a detailed, personal and almost complete picture of us—ultimately the digital twin, if you will. Are the Government content that the Bill does everything to respect and fully understand the need for everything to be seen as trustworthy, to be understood in terms of it being our data and our decision, and that we decide what data to deploy, for what purpose, to whom and for what time period? It is our data.
My Lords, it is a real privilege to follow the noble Lord, Lord Holmes. I hope that the Government will learn from his wisdom. I congratulate the noble Lord, Lord de Clifford; I am glad that his family was here to witness his powerful contribution.
I support the Government’s laudable aim to make the UK the most innovative society in the world of science and technology. I wish to record my gratitude for the many briefings provided to us by the Library, the 5Rights Foundation, Big Brother Watch, CRISP and Marie Curie, among many other notable organisations and individuals. The Government make sweeping assurances that this legislation is based on their commitment to all citizens enjoying access to a fair, inclusive and trustworthy digital environment. It is grounded in the hope that algorithmic systems would be designed to protect people from harm and from unsafe, unaccountable surveillance, with public involvement in their development, ensuring adequate safeguards as well as improved skills and information literacy. That is quite a mouthful, but it describes the many different aspects of the Bill.
Every part of our life is determined by some form of digitalisation, not least via our devices; they are an ever-present reminder, if any were required, of the interconnectedness of our existence at home and across the globe. We are living through an exponential rise in social media information alongside the extraordinary growth of technologies’ surveillance capacity. It is sometimes impossible to differentiate truth from reality in the mass of content across multiple platforms, and thus an assurance of public safeguarding within fast-moving technologies may not be achievable quite as easily as the Government suggest. Experts are consistently warning us of as yet uncharted harms from AI-driven technology, causing legitimate concern among civil society organisations.
So where does an ordinary citizen turn if they get caught up in some of the Bill’s punitive measures? As has been stated by noble Lords, contradicting progress made in this House, the Bill will provide the Government with yet more unprecedented powers, which will evidently result in the limiting of and infringement on citizens’ rights to privacy; this was detailed powerfully by the noble Lord, Lord Sikka. The Bill is complex and has a broad spectrum of remits that will impact every aspect of our lives: in the home, at work and outside. Time will not permit us all to consider adequately the questions and concerns raised by many well-respected organisations; we in this Parliament therefore owe it to all our citizens to ensure that our legislation is not immune to proper scrutiny.
Noteworthy parts of the Bill that cause concern include those regarding the safeguarding of children’s well-being. I thank the 5Rights Foundation for its briefing and agree that the Bill’s proposed changes to the UK’s data protection regime risk eroding the high level of privacy that children currently have a right to, making them less safe online. My noble friend Lady Kidron raised these matters thoroughly with her usual expertise; I absolutely agree that children’s safety cannot be designed to maximise economic benefits and add my voice to her call that the Government must keep their promise to bereaved families and ensure that children continue to be given heightened levels of data protection.
This leads me on to the matter of data collection, including how data is stored and shared with external organisations. In this context, there must be an absolute commitment from the Government to preserving the integrity of our personal data. Organisations and institutions should not and cannot be allowed to access personal information on the basis of assumed rather than informed consent.
We know that huge datasets are gathered by law enforcement, our NHS, welfare and financial services, alongside local authorities for voter registration purposes. The Data and Marketing Association, representing around 700 companies, including charities and commercial brands, suggests that Clauses 114 and 115, on the new exemption for direct marketing used for democratic engagement, could be open to abuse. While the open electoral register has been an important resource for businesses and charities in verifying addresses, it has also been used for direct business marketing, as has been stated.
Any amendments to this aspect of the Bill must not be on the assumption that, if a person does not opt out, she or he is fully cognisant of giving automatic consent for data sharing. I do not accept that this is well known to and understood by the millions of elderly and vulnerable people who do not opt out, or that they do so knowingly. It is our duty to empower all citizens, not just those who can readily access and are confident in this rapidly expanding digital environment. Noting the Minister’s comment on co-designing the Bill with stakeholders, will he give an assurance that partners included advocacy and civil rights organisations?
Clause 9 would give public authorities and other bodies wider scope to refuse subject access requests, making it more difficult for people to find out what information about them is being held and how it is being used. Clause 9 should not have a place in this legislation.
Clause 20 would water down requirements to carry out a proper impact assessment. This means that, in many cases, organisations and businesses, including local authorities processing data, will not have to consider fully whether data processing is necessary and proportionate. Clause 20 should also be removed from this Bill.
I hope to see us strengthening Clauses 110, 113 and 116, providing greater protection for consent and safeguarding consumers. Whatever the final impact of the legislation, many public and corporate institutions already hold a ginormous amount of digital material. As someone with years of local authority experience, I can say that safeguarding paper files seems like an alternate universe. All the protocols were written on every manager’s file, and any breaches or failures could have landed any one of us in court. I cannot comprehend all the protocols that may be required to protect individual data under this Bill. How will the Government monitor whether protocols issued as a result of this legislation are actually being adhered to? Who will be held accountable for the anonymity of data holders, given the heightened concern raised by the noble Lord, Lord Knight, and the noble Baroness, Lady Kidron?
When it comes to issues of surveillance of our citizens and the use and retention of biometric data, no matter the reason, we must provide the highest standards and maximum safeguards to all those who will determine whether an individual has transgressed rules. The Commons’ deliberation on banks spying on welfare claimants would have caused many vulnerable elders distress, as will continuous police profiling of some sections of our communities for perceived fraudulent behaviour and/or acting against national security interests. We must not be lulled into a false sense of security by relying on individual Ministers to make arbitrary decisions and add yet another list to surveillance. Like my noble friend Lady Young, I question how the DWP, bank personnel and police officers will implement a law that falls below parliamentary scrutiny and the highest standards of ethics.
We must acknowledge that the Bill has caused widespread concern. I agree with many who have written to me that, if we are to avoid thousands of litigations, we require a nationwide education programme to ensure wider public knowledge, and to ensure that consumer groups and charities understand thoroughly how they are likely to be affected by the proposed legislation and, more importantly, the potential impact of the proposed powers on the relationship with the DWP and other institutions.
In fact, CRISP’s insightful briefing reminds us to consider genuine, meaningful and trustworthy oversight of the Bill, which aims to simplify the regulatory architecture of UK surveillance oversight but risks creating a vacuum in the regulation of digital surveillance, abandoning clear guidance and standards, complicating oversight governance, and creating vulnerabilities for users of these technologies and for the rights of those subjected to them.
The Bill removes the Biometrics and Surveillance Camera Commissioner’s obligations to report to Parliament and the public on appropriate surveillance use, which endangers the visibility and accountability of police activities. It also bears on the concerns raised by the right reverend Prelate the Bishop of St Albans. The Bill must therefore retain the surveillance camera code of practice, which is essential for public trust.
The Bill gives the Secretary of State broad powers to amend our data protection laws via statutory instrument without adequate scrutiny by Parliament. Many fear that such extensive powers cannot possibly be for the public good, given the records of all Governments, whether in the manipulation of facts or in the institutional profiling of black and other minoritised communities in the name of national security. This will simply not be accepted by today’s digitalised generation, and the proposition that such information can be held indefinitely, without remedy or recourse to justice, cannot bode well for our nations.
At a glance, the UK GDPR sets out seven principles, including integrity and accountability. These fundamental rights for citizens cannot be guaranteed under the Bill as it stands. I look forward to all of us making the necessary changes to make better laws for the public good.
Finally, I wish all our outstanding staff across the House, and noble Lords and their families who are celebrating, a loving and joyful Christmas. I wish everyone well.
My Lords, I congratulate the noble Lord, Lord de Clifford, on his excellent maiden speech. I am sure that in this area and others he will be a valuable addition to the House.
One of the advantages of speaking towards the end of the debate is that much of what one could have said has already been said. I particularly enjoyed the speech from my noble friend Lord Knight of Weymouth highlighting the way in which the Bill is consistently behind the curve, always fighting the last war. To some extent, that is inevitable in a field like this, which is developing so rapidly, and I am not convinced that sufficient thought has been given to how developments in digital technology require developments in how it is tackled in legislation.
I think we will have an interesting Committee, in which I will participate as much as I can. The Minister will have a busy spring, with at least two major Bills going through. I hope the Whips have taken account of the number of concerns that have been expressed in this debate, and by external bodies, and that enough time will be allowed in Committee. A particular concern is the large number of amendments added at a late stage in the Commons, which have not had sufficient consideration. It will be our job to look at them in detail.
The proposal to allow the inspection of people’s bank accounts with no due cause is a matter of due concern, which has been mentioned by many people in this debate. I highlight the remarks of UK Finance, the representative body for the banking and financial sector. It says:
“These Department for Work and Pensions proposals have been suggested previously, but they are not part of the economic crime plan 2 or fraud strategy, which are the focus of industry efforts in terms of public-private partnership in tackling economic crime”.
UK Finance goes on to suggest that powers should be more narrowly focused, that they should not leave vulnerable customers disadvantaged—as would appear to be the case in the current drafting—and that further consultation is needed with consumer groups and charities to capture the wider needs of people affected by this proposal. It also suggests that the delivery time for this proposal should be extended even further into the future. For the benefit of the Minister, I shall just interpret that by explaining that what it is saying is, “We have no idea where this proposal came from. It has no part in the overall strategy that was being developed to tackle fraud and we want it pushed off into the indefinite future”—in other words, do not bother. Perhaps the Minister will listen to UK Finance.
I want to focus my remarks on health and health data, which are a particular concern. Health data is so intimate and personal that it requires additional consideration. It is not just another piece of data; it goes to the heart of who we are. The Government said in the context of the King’s Speech that this Bill has been written with industry and for industry. Well, quite. It is possible that some of the changes might result in less work for businesses, including those working in healthcare, but the danger is that the additional flexibility being proposed will in fact create additional costs because it is less clear and straightforward, there will be increased risks of disclosure of information that should not be disclosed, and the non-standardised regime will just lead to confusion.
Data regulation can slow down the pace of data sharing, increase people’s concerns about risk, and make research and innovation more difficult. Patients and the public generally quite rightly expect particularly high standards in this area, and I have concerns that this Bill makes the situation worse and that its influence is negative rather than positive. This is a danger, because it affects the public’s attitude to health and health data. If people are worried about the disclosure of their information, this impacts on them seeking and taking advantage of healthcare. That affects all of us, so it is not just a matter of personal concern.
One of the big arguments for the disclosure of health data is that it is available for scientific and developmental research. The need for this is recognised and there are additional safeguards. The UK Health Security Agency can reuse data that is collected by the NHS for the business of disease control, and that is something I am sure we all favour. However, the concept that any data can be reused for scientific purposes has grave dangers, particularly when this Bill fails to define tightly enough what the scientific and developmental research amounts to. The definition of scientific research here appears to apply to commercial as well as non-commercial outfits, whether it is funded publicly or is a private development. This is the sort of concern that we are going to have to tackle in Committee to provide people with the protection that they quite rightly expect.
If we look in more detail at health data, we see that it is protected by the Caldicott principles for health and social care data. It is worth reading the eight principles. The first sets the scene. It says, in the context of social care:
“Every proposed use … of confidential information should be clearly defined, scrutinised and documented, with continuing uses regularly reviewed by an appropriate guardian”.
This Bill is in grave danger of moving beyond that level of protection, which has been agreed and which people expect. People want and expect better regulation of their personal data and more say over what happens to it. This Bill moves us away from that.
It is worth looking in this context at the views of the BMA, which is particularly concerned about health data. It emphasises the fact that the public expect high standards and calls on this House to challenge what it regards as the “problematic provisions” and to seek some reassurance from the Government. I will list what the BMA regards as problematic provisions and why it does not like them: Clause 11, which erodes transparency of information to data subjects; Clauses 32, 35, 143 and 144, which risk eroding regulatory independence and freedom; Clause 1, which risks eroding protections for data by narrowing the definition of “personal data”; Clause 14, which risks eroding trust in AI; Clause 17, which risks eroding the expertise and independence of organisational oversight; and Clauses 20 and 21, which risk eroding organisational data governance. We will need to explore all of these issues in Committee. The hope is that they will get the attention that they deserve.
When it comes to medical data, there is an even stronger case, which the Bill needs to tackle straight on, around people’s genetic information. This is the holy grail of data, which people are desperate to get hold of. It says so much about people, their background and their experiences. We need a super level of protection for genetic data. Again, this is something that needs to be tackled in the Bill.
There are other issues of concern that I could mention—for example, the abolition of the Biometrics Commissioner and Surveillance Camera Commissioner. This is a point of particular concern, raised by a number of bodies. It is quite clear that something is being lost by moving these over to a single commissioner. There is a softer power held by the commissioners, which, to be honest, a single commissioner will not have the time or the bandwidth to deal with.
There is also concern that there needs to be explicit provision in the Bill to enable representative bodies, such as trade unions and commercial organisations, to pursue complaints and issues of concern on behalf of individuals. The issue of direct marketing, particularly of financial services, needs to be addressed.
So there is lots to do on this Bill. I hope the Minister recognises that, at this stage, we are just highlighting issues that need to be looked at in detail, and that time will be provided in Committee to deal with all these issues properly.
My Lords, at this late stage in any debate much of the field is likely to have been covered, but, as someone deeply involved in the crafting, drafting and evolution of the EU GDPR while an MEP in Brussels, I declare a strong vested interest in this subject. I hope that the Minister will not be too negative about the work that we did—much of it was done by Brits in Europe—on producing the GDPR in the first place.
I raised this issue at the recent UK-EU Parliamentary Partnership Assembly and in bilateral discussions with the European Parliament’s civil liberties committee, on which I served for many years, on its recent visit to London. Let me be candid: while the GDPR stands as a significant achievement, it is not without need for enhancement or improvement. The world has undergone a seismic shift since the GDPR’s inception, particularly in the realm of artificial intelligence. Both the UK and the EU need to get better at developing smart legislation. Smart legislation is not only adaptive and forward-looking; it is also flexible enough to evolve alongside emerging trends and challenges.
The importance of such legislation is highlighted by the rapid advancement in various sectors, and particularly in areas such as artificial intelligence—as so well referred to by my noble friend Lord Holmes of Richmond—and how our data is used. These fields are evolving at a pace that traditional legislative processes struggle to match. Such an approach is vital, not only to foster innovation but to ensure that regulations remain relevant and effective in a swiftly changing world, helping to maintain our competitive edge while upholding our core values and standards.
The aspirations of this Bill, which is aimed at modernising and streamlining the UK’s data protection framework while upholding stringent standards, are indeed laudable. I regret that, when my noble friend Lord Kamall was speaking about cookies, I was temporarily out of the Chamber enjoying a culinary cookie for lunch. While there may be further advantages to be unearthed in the depths of this complex legislation, so far, the biggest benefit I have seen is its commitment to removing cookie pop-ups. Above all, we must tread carefully to ensure international compliance, which has been referred to by a number of noble Lords, and steadfastly adhere to the bedrock GDPR principles of lawfulness, fairness, transparency, purpose limitation, data minimisation, accuracy, storage limitation and citizens’ redress.
On a procedural note, following other noble Lords, the Government’s recent flurry of amendments—I think there were 266 in total, including 38 new clauses and two new schedules, a staggering 240 of which were introduced at the 11th hour—places a key duty on our House to meticulously scrutinise the new legislation line by line. I have heard other speakers refer to my friend, the right honourable Member for Haltemprice and Howden, in the other place, who astutely observed that that House has
“in effect delegated large parts of the work on this important Bill to the House of Lords”.—[Official Report, Commons, 29/11/23; col. 888.]
I have to say that that is wonderful because, for those of us who are always arguing that this is the House that does the work, that is an acknowledgement of its skills and powers. It is a most welcome reference.
I wish to draw the House’s attention briefly to three important terms: adequacy, which noble Lords have heard about, equivalence and approximation. Adequacy in data protection primarily comes from the EU’s legal framework. It describes the standard that non-EU countries must meet to allow free flow of personal data from the EU. The European Commission assesses this adequacy, considering domestic laws and international commitments. The UK currently benefits from the EU’s two data adequacy decisions, which, I remind the House, are unilateral. However, we stand on the cusp of a crucial review in 2024, when the Commission will decide the fate of extending data adequacy for another four years and it has the power to withdraw its decision in the meantime if we threaten the basis for it. This Bill must not increase the risk of that happening.
Equivalence in the realm of data protection signifies that different systems or standards, while not mirror images, offer comparable levels of protection. It is about viewing a non-EU country’s data protection laws through a lens that recognises their parity with GDPR in safeguarding personal data. Past EU adequacy decisions have not demanded a carbon copy of laws; rather, they seek an essentially equivalent regulatory landscape.
Approximation refers to aligning the laws of EU member states with each other. In data protection, it could describe efforts to align national laws with GDPR standards. The imperative of maintaining data adequacy with the EU cannot be overstated; in fact, it has been stated by many noble Lords today. It stands as a top priority for UK business and industry, a linchpin in law enforcement co-operation, and a gateway to other vital databases. The economic stakes are monumental for both sides: EU personal data-enabled services exports to the UK were worth approximately £42 billion in 2018, and exports from the UK to the EU were worth £85 billion.
I commend the Government for listening to concerns that I and others have raised about democratic oversight and the independence of the Information Commissioner’s Office. The amendment to Clause 35, removing the proposal for the Secretary of State to veto ICO codes of practice, was welcome. This move has, I am informed, sent reassuring signals to our friends in Brussels. However, a concern still remains regarding the UK’s new ambition for adequacy partnerships with third countries. The Government’s impact assessment lists the United States, Australia, the Republic of Korea, the Dubai International Financial Centre, Singapore and Colombia, with future agreements with India, Brazil, Kenya and Indonesia listed as priorities.
Some of these nations have data standards that may not align with those of the EU or in fact offer fewer safeguards than our current system. I urge extreme caution in this area. We do not want to be in the situation where we gain a data partnership with Kenya but jeopardise our total data adequacy with the EU. Fundamentally, this Bill should not weaken data protection rights and safeguards. It should ensure transparency in data use and decision-making, uphold requirements for data processors to consider the rights and interests of affected individuals and, importantly, not stray too far from international regulations.
I urge my noble friend the Minister and others to see the importance of adopting a policy of permanent dynamic alignment with the EU GDPR, engaging actively with the EU as a partner rather than just implementing new rules blindly. Protecting and strengthening the UK-EU data partnership offers an opportunity for closer co-operation, benefiting businesses, consumers, innovation and law enforcement; and together, we can reach out to others to encourage them to join these truly international standards.
My Lords, I thank the Minister for his introduction to the Bill today and congratulate the noble Lord, Lord de Clifford, on his maiden speech. I think we all very much appreciated his valuable perspective on SMEs having to grapple with the intricacies of data protection. I very much look forward to his contributions—perhaps in Committee, if he feels brave enough.
The Minister will have heard the concerns expressed throughout the House—not a single speaker failed to express concerns about the contents of the Bill. The right reverend Prelate the Bishop of Southwell and Nottingham reminded us that the retention and enhancement of public trust in data use and sharing is of key importance, but so much of the Bill seems almost entirely motivated by the Government’s desire to be divergent from the EU to get some kind of Brexit dividend.
As we have heard from all around the House, the Bill dilutes the rights of data subjects where it should strengthen them; only by strengthening them can we all enjoy the benefits of data sharing without the risks involved. The Equality and Human Rights Commission is clearly of that view, alongside numerous others, such as the Ada Lovelace Institute and as many as 26 privacy advocacy groups. Even on the Government’s own estimates, the Bill will have a minimal positive impact on compliance costs—in fact, it will simply lead to companies doing business in Europe having to comply with two sets of regulations.
I will be specific. The noble Lord, Lord Davies of Brixton, set out the catalogue, and I will go through a number of areas where I believe those rights are being diluted. The amended and more subjective definition of “personal data” will narrow the scope of what is considered personal data, as the right reverend Prelate the Bishop of St Albans pointed out. Schedule 1 sets out a new annexe to the GDPR, with the types of processing activities that the Government have determined have a recognised legitimate interest and will not require a legitimate interest human rights balancing test to be carried out. Future Secretaries of State can amend or add to this list of recognised legitimate interests through secondary legislation. As a result, as the noble Baroness, Lady Bennett, pointed out, it will become easier for political parties to target children as young as 14 during election campaigns, even though they cannot vote until they are 16 or 18, depending on the jurisdiction.
The Bill will change the threshold for refusing a subject access request, which will widen the grounds on which an organisation could refuse requests. The noble Lord, Lord Sikka, reminded us of the existing difficulties of making those subject access requests. Clause 12, added on Report in the Commons, further tips power away from the individual’s ability to access data.
There are also changes to the automated decision-making provisions under Article 22 of the GDPR—the noble Lord, Lord Holmes, reminded us of the importance of the human in the loop. The Bill replaces Article 22 with articles that reduce human review of automated decision-making. As the noble Lord, Lord Knight, pointed out, Article 22 should in fact be strengthened so that it applies to partly automated processing as well, and it should give rights to people affected by an automated decision, not just those who provide data. This should be the case especially in the workplace. A decision about you may be determined by data about other people whom you may never have met.
The Bill amends the circumstances in which personal datasets can be reused for research purposes. New clarifying guidance would have been sufficient, but for-profit commercial research is now included. As the noble Lords, Lord Knight and Lord Davies, pointed out and as we discussed in debates on the then Online Safety Bill, the Bill does nothing where it really matters: on public interest researcher access.
The Bill moves away from UK GDPR requirements for mandatory data protection officers, and it also removes the requirement for data protection impact assessments. All this simply sets up a potential dual compliance system with less assurance—with what benefit? Under the new Bill, a controller or processor will be exempt from the duty to keep records, unless they are carrying out high-risk processing activities. But how effective will this be? One of the main ways of demonstrating compliance with GDPR is to have a record of processing activities.
There are also changes to the Information Commissioner’s role. We are all concerned about whether the creation of a new board will enable the ICO to maintain its current level of independence for data adequacy purposes. This is so important, as the noble Baroness, Lady Young, and my noble friend Lord McNally pointed out.
As regards intragroup transfers, there is concern from the National Aids Trust that Clause 5, permitting the intragroup transmission of personal health data
“where that is necessary for … administrative purposes”,
could mean that HIV/AIDS status is inadequately protected in workplace settings.
Schedule 5 to the Bill amends Chapter 5 of the UK GDPR to reform the UK’s regime for international transfers, with potential adverse consequences for business. The noble Lord, Lord Kirkhope, reminded us of the dangers of adopting too low standards internationally. This clearly has the potential to provide less protection for data subjects than the current test.
In Clause 17, the Bill removes a key enabler of collective interests—consultation with those affected by data processing during the data protection risk assessment process—and it fails to provide alternative opportunities. Then there is the removal of the legal obligation to appoint a representative. This risks data breaches not being reported, takes away a channel of communication used by the ICO to facilitate its investigations, and increases the frustration of UK businesses in dealing with overseas companies that come to the UK market underprepared to comply with the UK GDPR.
Given that catalogue, it is hardly surprising that so many noble Lords have raised the issue of data adequacy. If I read out the list of all the noble Lords who have mentioned it, I would probably mention almost every single speaker in this debate. It is clear that the Bill significantly lowers data protection standards in the UK, as compared with the EU. On these Benches, our view is that this will undermine the basis of the UK’s EU data adequacy. The essential equivalence between the UK and the EU regimes has been critical to business continuity following Brexit. The Government’s own impact assessment acknowledges that, as the UK diverges from the EU GDPR, the risk of the EU revoking its adequacy decisions will increase. So I very much hope that the Minister, in response to all the questions he has been asked about data adequacy, has some pretty good answers, because there is certainly a considerable degree of concern around the House about the future of data adequacy.
In addition, there are aspects of the Bill that are just plain wrong. The Government need to deliver in full on the commitments to bereaved families made during the passage of what became the Online Safety Act regarding access to their children’s data, as we have heard today from across the House, notably from the noble Baroness, Lady Kidron, who insisted that this be extended to all deaths of children. I very much hope that the Minister will harden up his assurances at the end of the debate.
The noble Lords, Lord Kamall and Lord Vaux, questioned the abolition of the Surveillance Camera Commissioner, and the diminution of the duties relating to biometric data. Society is witnessing an unprecedented acceleration in the capability and reach of surveillance technologies, particularly live facial recognition, and we need the commissioner and Surveillance Camera Code of Practice in place. As the Ada Lovelace Institute says in its report Countermeasures, we need new and more comprehensive legislation on the use of biometrics, and the Equality and Human Rights Commission agrees with that too.
As regards what the noble Lord, Lord Sikka, described as unrestrained financial powers, inserted at Commons Report stage, Sir Stephen Timms MP, chair of the Work and Pensions Select Committee, very rightly expressed strong concerns about this, as did many noble Lords today, including the noble Baroness, Lady Young, and the noble Lords, Lord Knight and Lord Fox. These powers are entirely disproportionate and we will be strongly opposing them.
Then we have the new national security certificates and designation notices, which were mentioned by the right reverend Prelate the Bishop of St Albans. These would give the Home Secretary great and unaccountable powers to authorise the police to violate our privacy rights, through the use of national security certificates and designation notices, without challenge. The Government have failed to explain why they believe these clauses are necessary to safeguard national security.
There is a whole series of missed opportunities in the Bill. As the noble Lord, Lord Knight, said in his opening speech, the Bill was an opportunity to create ethical, transparent and safe standards for AI systems. A number of noble Lords across the House, including the noble Lord, Lord Kamall, the noble Baroness, Lady Young, the right reverend Prelate the Bishop of Southwell and Nottingham, and my noble friend Lord McNally, said that this is a wasted opportunity to create measures adequate to an era of ubiquitous use of data through AI systems. The noble Baroness, Lady Kidron, in particular talked about this in relation to children, generative AI and educational technology. The noble Lord, Lord Holmes, talked of it in the public sector, where it is so important as well.
The EU has just agreed in principle to a new AI Act. We are miles behind the curve. Then, of course, we have the new identification verification framework. The UK has chosen not to allow private sector digital ID systems to be used for access. Perhaps the Government could explain why that is the case.
There are a number of other areas, such as new models of personal data control, which were advocated as long ago as 2017 in the Hall-Pesenti review. Why are the Government not being more imaginative in that sense? There is also the failure to create a new offence of identity theft. That seems to be a great missed opportunity in this Bill.
As the noble Baroness, Lady Kidron, mentioned, there is the question of holding AI system providers to be legally accountable for the generation of child sexual abuse material online by using their datasets. My noble friend Lord McNally and the noble Lord, Lord Kamall, raised the case of ICO v Experian. Why are the Government not taking the opportunity to correct that case?
In the face of the need to do more to protect citizens’ rights, this Bill is a dangerous distraction. It waters down rights, it is a huge risk to data adequacy, it is wrong in many areas and it is a great missed opportunity in many others. We on these Benches will oppose a Bill which appears to have very few friends around the House. We want to amend a great many of the provisions of the Bill and we want to scrutinise many other aspects of it where the amendments came through at a very late stage. I am afraid the Government should expect this Bill to have a pretty rough passage.
My Lords, first I want to thank all those noble Lords who have spoken today, and actually, one noble Baroness who has not: my colleague, the noble Baroness, Lady Jones. I am sure the whole House will want to wish her a safe and speedy recovery.
While I am name-checking, I would also like to join in the general congratulation of the noble Lord, Lord de Clifford, who, as others have observed, made a valuable case on behalf of small businesses and SMEs generally, and also called, in his words, for investment to assist this sector to deal with the challenges of data protection.
The range of concerns raised is a good indication of the complexity of this Bill and the issues which will keep us pretty busy in Committee, and I am sure well beyond. We have been well briefed; a record number of briefings have been dispatched in our direction, and they have been most welcome in making sure that we are on top of the content of this Bill.
At the outset, let me make it clear that we support the principle of modernising data protection legislation and making it fit for purpose in a rapidly changing technological landscape, and we join noble Lords such as the noble Lord, Lord Kirkhope, who made the case for ensuring that the legislation is relevant. We need to scrutinise it properly, and we understand the need to simplify the rules and make them clearer for all concerned. Most speakers commented on this real need and desire.
However, as others have said, this Bill represents a missed opportunity to grasp the challenges in front of us. It tinkers rather than reforms, it fails to offer a new direction and it fails to capitalise on the positive opportunities the use of data affords, including making data work for the wider social good. I thought the noble Lord, Lord Holmes, made a good case in saying it is our data and therefore needs to be treated with respect. I do not think this Bill does that.
The Bill fails to build on the important safeguards and protections that have been hard won by others in other fields of legislation covering the digital world, in particular, about the use of personal data that we want to see upheld and strengthened. The noble Baroness, Lady Kidron, made an inspired speech, pleading with us to hold the Government’s feet to the fire on this issue and others.
The Bill also fails to provide the simplicity and certainty that businesses desire, given that it is vital that we retain our data adequacy status with the EU. Therefore, businesses will find themselves navigating two similar but, as others have said, divergent sets of rules, a point well made by the right reverend Prelate the Bishop of St Albans and the noble Lords, Lord Vaux and Lord Kirkhope. In short, it feels like a temporary holding position rather than a blueprint for reform, and I suspect that, all too soon, we will be back here with a new Bill—perhaps a data protection (No. 3) Bill—which will address the more profound issues at the frontier of data use.
Before that, I must pick up the points made by my noble friend Lord Knight, who opened the debate for us. It is an affront to our parliamentary system that the Government chose to table 266 amendments on the last available day before Report in the Commons—about 150 pages of amendments to consider in a single debate. The marvellous notes that accompany the Bill had to be expanded by something like a fifth to take account of all these amendments; the Bill has grown over time. Clearly, our Commons colleagues had no way of scrutinising these amendments with any degree of effectiveness, and David Davis made the point that it is down to us now to make sure that that job is well done.
I agree that some of the amendments are technical, but others are very significant, so can the Minister explain why it was felt necessary to rush them through without debate? For example, the new Schedule 1 will grant the Secretary of State the power to require banks, or other financial institutions, to provide the personal data of anyone in receipt of benefits. These include state pensions and universal credit, but they also include other benefits—working tax credit, child tax credit, child benefit, pension credit, jobseeker’s allowance and personal independence payments. That is a long list; we think that it probably covers some 40% of the population. What is the Government’s real need here?
Yesterday, we had a consultation session with the Minister. I asked where the proposals came from, and he was very honest that they were included in a DWP paper on fraud detection some two years ago. Why is it that the amendments were put into the Bill so late in the day, when they have been around and accessible to the Government for two years? Why has there not been any effective consultation on this? Nobody was asked whether they wanted these changes made, and it seems to me that the Government have acted in an entirely high-handed way.
Most of the population will fall into one or the other of the categories, as my noble friend Lady Young and the noble Lord, Lord Vaux, made clear. Some, such as the noble Lord, think that they might be exempted—but, having listened to my list, he may think otherwise. The criteria for these data searches are not clarified in the Bill and have no legislative limit. Why is that the case?
As Mel Stride and the DWP officials made clear when giving evidence to the Work and Pensions Select Committee recently, this is not about accessing individual bank accounts directly where fraud is suspected, it is about asking for bulk data from financial organisations. How will the Government be able to guarantee data security with bulk searches? When were the Government planning to tell the citizens of this country that they were planning to take this new set of powers to look into their accounts? I warn the Minister that I do not think it will go down very well, when the Government fully explain this.
Meanwhile, the banking sector has also raised concerns about the proposals, which it describes as too broad and liable to put vulnerable customers at a disadvantage. The ICO also questions the proportionality of the measure. Let me make our position clear on this: Labour is unreservedly committed to tackling fraud. We will pursue the fraudsters, conmen and claimants who try to take money from the public purse fraudulently or illegally. This includes those involved in tax fraud or dodgy PPE contracts. As our shadow Minister Chris Bryant made clear in the Commons:
“I back 100% any attempt to tackle fraud in the system, and … will work with the Government to get the legislation right, but this is not the way to do it”.—[Official Report, Commons, 29/11/23; col. 887.]
I hope that the Minister can confirm that he will work with stakeholders, banks and ourselves to find a better way to focus on tackling fraud in all its guises.
Another aspect of the Bill that was revealed at a late date is the set of rules governing democratic engagement, to which a number of Peers have referred today. The Bill extends the opportunities for direct mail marketing for charitable or political purposes. It also allows the Secretary of State to change the rules for the purposes of democratic engagement. It has now become clear that this will allow the Government to switch off the direct marketing rules in the run-up to an election. Currently, parties are not allowed to send emails, texts, voicemails and so on to individuals without their specific consent. We are concerned that changing this rule could transform UK elections. These powers were opposed in the public consultation on the Bill; this is not what the public want. We have to wonder at the motives of the Government in trying to change these rules at such a late stage and with the minimum of scrutiny. This is an issue to which we will return in Committee; I hope that the Minister can come up with a better justification than his colleagues in the Commons were able to.
I turn to other important aspects of the Bill. A number of noble Lords gave examples of how personal rights to information and data protection, which were previously in the GDPR and the Data Protection Act 2018, have been watered down or compromised. For example, subject access requests have been diluted by allowing companies to refuse such requests on the grounds of being excessive or vexatious—terms that, by their very nature, are hard to define—or by allowing the Secretary of State to define who has a recognised, legitimate interest for processing personal data. Similarly, there is no definition in the Bill of what constitutes high-risk processing—risking uncertainty and instability for businesses and the potential misuse of personal data. We will want to explore these definitions in more detail.
A number of noble Lords quite rightly raised the widespread fear of machines making fundamental decisions about our lives with no recourse to a human being to moderate the decision. The impact of this can be felt more widely than an individual data subject—it can impact on a wider group of citizens as decisions are made, for example, on policing priorities, healthcare and education. This can also have a hugely significant impact in the workplace. Obviously, algorithms and data analysis can bring huge benefits to the workplace, cutting out mundane tasks and ensuring greater job satisfaction. But we also need to ensure that workers and their representatives know what data is being collected on them and have an opportunity for human contact, review and redress when an algorithmic system is used to make a decision. For example, we need to avoid a repeat of the experience of the Just Eat couriers who were unfairly sacked by a computer. We will want to explore how the rights of individuals, groups of citizens and workers can better be protected from unfair or biased automated decisions.
The noble Baroness, Lady Kidron, and others have argued the case for new powers to give coroners the right to access data held by tech companies on children where there is a suspicion that the online world contributed to their death. This is a huge and tragic issue that the Government have sadly ducked, although the promise to listen that I heard from the Minister was very welcome. We shall ensure that we keep him to that commitment.
Despite all the promises made, however, the Government have broken the trust of bereaved parents who were expecting this issue to be resolved in the Bill. Instead, the amendment addresses only cases where a child has taken their own life. We will do what we can in this Bill to make sure that the commitments made in the Online Safety Act are fully honoured.
On a separate but important point, Clause 2 allows companies to exploit children’s data for commercial purposes. We believe that without further safeguards, children’s rights will be put very much at risk as companies collect information on where they live, what they buy, how they travel and what they study. We will seek to firm up those children’s rights as the Bill goes forward.
On cookie pop-ups, it is widely accepted that the current system is not working: everyone ignores them and they have become an irritant. But they exist for a purpose, which is to ensure that the public are informed of the data being kept on them, so we do not believe that simply removing them is the answer. Similarly with nuisance calls, we want to ensure that the new rules are workable by clarifying the responsibilities of telecoms companies.
As I said at the outset, we regard the Bill as a disappointment that fails to harness the huge opportunities that data affords and to build in the appropriate safeguards. My noble friend Lord Knight put his finger on it at the start of the debate when he said that we need a data protection Bill, but not this Bill.
The Government’s answer to a lack of clarity in so many areas of the Bill is to build in huge, sweeping Henry VIII powers. When reviewing the legislation recently, we counted more than 40 proposed statutory instruments. That is an immense amount of power in the hands of the Secretary of State. We do not believe that this is the right way to legislate on a Bill that is so fundamental to people’s lives and the future of our economy. We want to rein these powers back in so that they are subject to the appropriate level of parliamentary scrutiny.
With this in mind, and taking into account all the concerns raised today, we look forward to a long and fruitful exchange with the Government over the coming months. This will be a Bill that challenges the Government.
My Lords, I sincerely thank all of today’s speakers for their powerful and learned contributions to a fascinating and productive debate. I very much welcome the engagement in this legislation that has been shown from across the House and such a clear setting out, at this early stage, of the important issues and caveats.
As I said, the Bill reflects the extensive process of consultation that the Government have undertaken, with almost 3,000 responses to the document Data: A New Direction, and the support it enjoys from both the ICO and industry groups. The debate in which we have engaged is a demonstration of noble Lords’ desire to ensure that our data protection regime evolves and works more effectively, while maintaining the highest standards of data protection for all.
I will respond to as many of the questions and points raised as I can. I hope noble Lords will forgive me if, in the interests of time and clarity, I do not name every noble Lord who spoke to every issue. A number of noble Lords expressed the wish that the Government remain open to any and all conversations. Should I inadvertently fail to address any point satisfactorily, I affirm that I am very willing to engage with all noble Lords throughout the Bill’s passage, recognising its importance and, as the noble Lord, Lord Bassam, said, the opportunity it presents to do great good.
Many noble Lords raised concerns that the Bill does not go far enough to protect personal data rights. This is certainly not our intent. The fundamental data protection principles set out in the UK GDPR—as my noble friend Lord Kirkhope pointed out, they include lawfulness, fairness, transparency, purpose limitation, data minimisation, accuracy, storage limitation, security and accountability—remain at the heart of the UK’s data protection regime. Certain kinds of data, such as health data, remain special categories to which extra protections rightly apply. Changes such as requiring a senior responsible individual, rather than a data protection officer, mean that organisations still need to be accountable for how they process personal data but will have more flexibility about how they manage the data protection risks within their organisations.
On other specific points raised on the data protection framework, I agree that the right of access is key to ensuring transparency in data processing. The proposals do not restrict the right of access for reasonable requests for information and keep reasonable requests free of charge. On the creation of the new recognised legitimate interests lawful grounds, evidence from our consultation indicated that some organisations worried about getting the balancing test wrong, while others said that the need to document the outcome of their assessment could slow down important processing activities.
To promote responsible data sharing, the Bill recognises a limited number of public interest tasks, including safeguarding, crime prevention and national security, responding to emergencies and democratic engagement, for which data controllers should not be required to carry out a case-by-case balancing test. The Bill acknowledges the importance of these activities in this way.
On cookies, the Bill will allow the Secretary of State to remove the need for data controllers to seek consent for other purposes in future, when the appropriate technologies to do so are readily available. The aim is to offer the user a clear, meaningful choice that can be made once and respected throughout their use of the internet. However, before any such powers are used, we will consult further to make sure that people are more effectively enabled to use different technology to set their online preferences.
On democratic engagement, extending the exemption allows a limited number of individuals, such as elected representatives and referendum campaigners, to process political opinions data without consent where this is necessary for their political activities. In a healthy democracy, it is not just registered political parties that may need to process political opinions data, and these amendments reflect that reality. This amendment does not remove existing rights. If people do not want their data processed for these purposes, they can ask the controller to stop doing so at any time. Before laying any regulations under this clause, the Government would need to consult the Information Commissioner and other interested parties, as well as gaining parliamentary approval.
I turn now to concerns raised by many about the independence of the regulator, the Information Commissioner. The ICO remains an independent regulator, accountable to Parliament, not the Government, in its delivery of data protection regulation. The Bill ensures it has the powers it needs to remain the guardian of people’s personal data. It can and does produce guidance on what it deems necessary. The Government welcome this and will work closely with it ahead of and throughout the implementation of this legislation.
New powers will also help to ensure that the Information Commissioner is able to access the evidence he needs to inform investigations and has the time needed to consider and respond to representations. This will result in more informed investigations and better outcomes. The commissioner will be able to require individuals to attend interviews only if he suspects that an organisation has failed to comply with data protection legislation or has committed an offence under it. This power is based on existing comparable powers for the Financial Conduct Authority and the Competition and Markets Authority. A person is not required to answer a question if doing so would breach legal professional privilege or reveal evidence of an offence.
As the noble Lord, Lord Clement-Jones, pointed out, EU adequacy was mentioned by almost everybody, and concerns were raised that the Bill would affect our adequacy agreement with the EU. The Government believe that our reforms are compatible with maintaining our data adequacy decisions from the EU. While the Bill removes the more prescriptive elements of the GDPR, the UK will maintain its high standards of data protection and, after our reform, will continue to have one of the regimes closest to the EU’s in the world. The test for EU adequacy set out by the Court of Justice of the European Union in the case law relating to adequacy decisions requires essential equivalence to the level of protection under the GDPR. It does not require a third country to have exactly the same rules as the EU in order to be considered adequate. Indeed, 14 countries have EU adequacy, including Japan, New Zealand and Canada, all of which pursue their own independent approaches to data protection.
Regarding our national security practices, in 2020 and 2021, the European Commission carried out a thorough assessment of the UK’s legislation and regulatory framework for personal data, including access by public authorities for national security purposes. It assessed that the UK provides an adequate level of data protection. We maintain an ongoing dialogue with the EU and have a positive, constructive relationship. We will continue to engage regularly with the EU to ensure our reforms are understood.
A great many noble Lords rightly commented on AI regulation, or the lack of it, in the Bill. Existing data protection legislation—the UK GDPR and the Data Protection Act 2018—already regulates the development of AI systems and other technologies to the extent that personal data is involved. This means that the ICO will continue to play an important role in applying the AI principles as they relate to matters of privacy and data protection. The Government’s view is that it would not be effective to regulate the use of AI in this context solely through the lens of data protection.
Article 22 of the UK GDPR is currently the primary piece of UK law setting out the requirements relating to automated decision-making. The Bill sets out the rights that data subjects have to be informed about significant decisions taken about them through solely automated means, to seek human review of those decisions and to have them corrected. This type of activity is, of course, increasingly AI-driven, so it is important to align these reforms with the UK’s wider approach to AI governance, as published in the White Paper developed by the Office for Artificial Intelligence. This includes ensuring that terms such as “meaningful human involvement” remain up to date and relevant, and the Bill includes regulation-making powers to that effect. The White Paper on the regulation of AI commits to a principles-based approach that supports innovation, and we are considering how the framework will apply to the various actors in the AI development and deployment life cycle, with a particular focus on foundation models. We are analysing the views we heard during the White Paper consultation and will publish a response imminently; we do not want to get ahead of that process at this point.
I turn to the protection of children. Once again, I thank noble Lords across the House for their powerful comments on the importance of protecting children’s data, including in particular the noble Baroness, Lady Kidron. On the very serious issue of data preservation orders, the Government continue to make it clear—both in public, at the Dispatch Box, and in private discussions—that we are firmly on the side of the bereaved parents. We consider that we have acted in good faith, and we all want the same outcomes for these families struck by tragedy. We are focused on ensuring that no parent is put through the same ordeal as these families in the future.
I recognise the need to give families the answers they require and to ensure there is no gap in the law. Giving families the answers they need remains the Government’s motivation for the amendment in the other place; it is the reason we will ensure that the amendment is comprehensive and is viewed as such by the families. I reassure the House that the Government have heard and understand the concerns raised on this issue, and that is why the Secretary of State, along with Justice Ministers, will work with noble Lords ahead of Committee and carefully listen to their arguments on potential amendments.
I also hear the concerns of the right reverend Prelate the Bishop of St Albans, the noble Lord, Lord Vaux, and the noble Baroness, Lady Young, on surveillance, police powers and police access to data. Abolishing the Surveillance Camera Commissioner will not reduce data protection. The role overlaps with other oversight bodies, which is inefficient and confusing for police and the public. The Bill addresses the duplication, which means that the ICO will continue to regulate data processing across all sectors, including policing. The aim is to improve effective independent oversight, which is key to public confidence. Simplification through consolidation improves consistency and guidance on oversight, makes the most of the available expertise, improves organisational resilience, and ends confusing and inefficient duplication.
The Government also have a responsibility to safeguard national security. The reports into events such as the Manchester Arena and Fishmongers’ Hall terrorist incidents have clearly noted that better joined-up working between the intelligence services and law enforcement supports that responsibility. This is why the Bill creates the power for designation notices to be issued, enabling joint controllerships between the intelligence services and law enforcement. Before granting a notice, the Secretary of State must be satisfied that the processing it covers is required for the purpose of safeguarding national security. This mirrors the high threshold for interference with the right to privacy under Article 8 of the European Convention on Human Rights, which requires that such interference be in accordance with the law and necessary in a democratic society.
Concerns were raised by, among others, the noble Baronesses, Lady Young and Lady Bennett, and the noble Lords, Lord Sikka and Lord Bassam, on the proportionality of the measure that helps the Government to tackle fraud and error. Despite taking positive steps to reduce these losses, the DWP remains reliant on powers derived from legislation that is, in part, over 20 years old. The DWP published its fraud plan in May 2022, setting out clearly a number of new powers that it would seek to secure when parliamentary time allowed. Tackling fraud and error is a priority for the Government, but parliamentary time is tight. In the time available, the DWP has prioritised our key third-party data-gathering measure, which will help to tackle one of the largest causes of fraud and error in the welfare system. We remain committed to delivering all the legislation outlined in the DWP’s fraud plan when parliamentary time allows.
To develop and test these new proposals, the DWP has been working closely with the industry, which recognises the importance of modernising and strengthening these powers to enable us to better detect fraud and error in the benefit system. This includes collaboration on the practical design, implementation and delivery of this measure, including establishing a working group with banks and the financial industry. The DWP has also engaged regularly with UK Finance, as well as with individual banks, building societies and fintechs, during the development of this measure, and continues to do so. It is of course important that, where personal data is involved, there are appropriate checks and balances. Organisations have a right to appeal against the requirement to comply with a data notice issued by the DWP.
Through our appeal process, the Government would first seek to resolve any dispute by DWP internal review. If this failed, the appeal would be referred to the First-tier Tribunal (Tax Chamber), as is currently done in similar circumstances by HMRC. The third-party data-gathering powers that the DWP is taking are broad only to the extent needed to future-proof them, because the nature of fraud has changed significantly in recent years and continues to do so. The current powers that the DWP has are not sufficient to tackle the new kinds of fraud that we now see in the welfare system. We are including all benefits to ensure that benefits such as the state pension retain their low rates of fraud. The DWP will of course want to focus this measure on areas with a significant fraud or error challenge. It has set out in its fraud plan how it plans to focus the new powers, which in the first instance will be on fraud in universal credit.
I thank noble Lords, particularly the noble Lord, Lord Vaux, for the attention paid to the department’s impact assessment, which sets out the details of this measure and all the others in the Bill. As he notes, it is substantive and thorough and was found to be such by the Regulatory Policy Committee, which gave it a green rating.
I hope that I have responded to most of the points raised by noble Lords today. I look forward to continuing to discuss these and other items raised.
I would like some clarification. The Minister in the other place said:
“I agree, to the extent that levels of fraud in state pensions being currently nearly zero, the power is not needed in that case. However, the Government wish to retain an option should the position change in the future”.—[Official Report, Commons, 29/11/23; col. 912.]
Can the noble Viscount explain why the Government still want to focus on recipients of state pension given that there is virtually no fraud? That is about 12.6 million people, so why?
Although fraud in the state pension is proportionately very low, it is still there. That will not be the initial focus, but the purpose is to future-proof the legislation rather than to have to keep coming back to your Lordships’ House.
Let me once again thank all noble Lords for their contributions and engagement. I look forward to further and more detailed debates on these matters and more besides in Committee. I recognise that there are strong views and it is a wide-ranging Bill, so there will be a lot of meat in our sandwich.
I congratulate the noble Lord, Lord de Clifford, on his perfectly judged maiden speech. I thoroughly enjoyed his description of his background and his valuable contributions on the Bill, and I welcome him to this House.
Finally, on a lighter note, I take this opportunity to wish all noble Lords—both those who have spoken in this debate and others—a very happy Christmas and a productive new year, during which I very much look forward to working with them on the Bill.
Bill read a second time.
Commitment and Order of Consideration Motion
Moved by
That the bill be committed to a Grand Committee, and that it be an instruction to the Grand Committee that they consider the bill in the following order:
Clauses 1 to 5, Schedule 1, Clause 6, Schedule 2, Clauses 7 to 14, Schedule 3, Clauses 15 to 24, Schedule 4, Clause 25, Schedules 5 to 7, Clauses 26 to 46, Schedule 8, Clauses 47 to 51, Schedule 9, Clauses 52 to 117, Schedule 10, Clauses 118 to 128, Schedule 11, Clauses 129 to 137, Schedule 12, Clause 138, Schedule 13, Clauses 139 to 142, Schedule 14, Clause 143, Schedule 15, Clauses 144 to 157, Title.
Motion agreed.