The Committee consisted of the following Members:
Chairs: Wera Hobhouse, † Karl Turner
† Anderson, Callum (Buckingham and Bletchley) (Lab)
† Aquarone, Steff (North Norfolk) (LD)
† Beales, Danny (Uxbridge and South Ruislip) (Lab)
† Bryant, Chris (Minister for Data Protection and Telecoms)
† Collins, Victoria (Harpenden and Berkhamsted) (LD)
† Dearden, Kate (Halifax) (Lab/Co-op)
Entwistle, Kirith (Bolton North East) (Lab)
† Fortune, Peter (Bromley and Biggin Hill) (Con)
† Josan, Gurinder Singh (Smethwick) (Lab)
† Juss, Warinder (Wolverhampton West) (Lab)
† Kumar, Sonia (Dudley) (Lab)
† Macdonald, Alice (Norwich North) (Lab/Co-op)
† McIntyre, Alex (Gloucester) (Lab)
† Obese-Jecty, Ben (Huntingdon) (Con)
† Pearce, Jon (High Peak) (Lab)
† Robertson, Joe (Isle of Wight East) (Con)
† Spencer, Dr Ben (Runnymede and Weybridge) (Con)
David Weir, Kevin Candy, Sanjana Balakrishnan, Committee Clerks
† attended the Committee
Public Bill Committee
Tuesday 11 March 2025
(Morning)
[Karl Turner in the Chair]
Data (Use and Access) Bill [Lords]
Clause 66
The 2018 Act and the UK GDPR
Question proposed, That the clause stand part of the Bill.
It may assist the Committee to know that, when there is no great controversy about a clause and all I am doing is reminding people what is in the clause and the explanatory notes, I will move that the clause stand part formally. We are now starting to discuss part 5 of the Bill, which relates to GDPR. This clause is completely and utterly self-explanatory.
It is a pleasure to serve under your chairmanship again, Mr Turner. This part of the Bill relates to GDPR. Although there are some amendments that we will debate later, it makes sensible and long-overdue clarifications to GDPR and its interaction with data protection, particularly through some updates to the use of secondary data for scientific research, which I suspect we will discuss when we come to the next clause.
Question put and agreed to.
Clause 66 accordingly ordered to stand part of the Bill.
Clause 67
Meaning of research and statistical purposes
I beg to move amendment 13, in clause 67, page 75, line 26, leave out
“and that is conducted in the public interest”.
This amendment removes words from new paragraph 2 of Article 4 of the UK GDPR (meaning of processing for the purposes of scientific research). The words were inserted at Report stage in the Lords.
The UK has a proud tradition of innovative research. Our researchers are at the forefront in fields from quantum computing to medicine—for example, leading on pioneering vaccine development during the covid-19 pandemic. Scientific research can bring life-changing benefits to our society and economy. Given that importance, the UK data protection framework contains certain accommodations for processing personal data for research purposes while maintaining key safeguards. The clause, as amended by their lordships, would require an up-front public interest test for processing to be considered as scientific research and eligible for these accommodations. We disagree with that.
We agree with wanting to avoid misuse of the term “scientific research”, but as the Royal Society has said clearly, the reasonableness test that is now in the Bill provides adequate protection against that. The new test, provided by the Lords, would also be at odds with the internationally recognised Frascati definition, which does not mention an up-front public interest test.
Such a test would be a new burden on many researchers. The Royal Society is concerned that that could have a negative effect on the conduct of research. The Government agree and are especially concerned about the impact on basic and curiosity-driven research. The public benefits from such research may not be known at the time, yet with hindsight may be overwhelming. Many of the great discoveries in scientific history, from penicillin to the electron, originated from research that had no anticipated public interest benefit. No one could know how valuable they would be to us now. Even the mRNA-based vaccines that saved millions of lives during the covid-19 pandemic drew on curiosity-driven research that for years had had no practical applications.
That is research that we should be supporting, not stifling. An up-front public interest test could have a chilling effect on it, which is not in the long-term public interest. Our amendment would therefore remove the public interest test inserted by their lordships.
I listened carefully to the Minister’s remarks on the Government’s position on the public interest test, which was extensively debated in the other place. I have a great deal of sympathy with what their lordships were trying to achieve in parts of that amendment. We are talking about the secondary use of data for scientific research and, as I understand it, a public interest test is already used in other areas of scientific research—for example, with public health data. This is therefore not something completely alien to the concept of scientific research. However, I concede that there is an issue with how one defines scientific research.
I think what people are nervous about—particularly in some of the conversations I have had around this new definition of scientific research for use in secondary data processing—is this being hijacked by artificial intelligence data companies as a way of getting around some of the challenges with using big datasets for which they do not necessarily have licensing arrangements. I know we are going to come on to that later in the Bill. I would be grateful if the Minister could explain his thoughts and considerations about removing this public interest test. How would that interact with some of the concerns about AI and data use?
From a personal perspective, my concept of scientific research has an altruistic component to it. At least with the scientific research with which I was involved, the whole point was that the information was shared freely. Clearly there are challenges and considerations on which one needs to focus when it comes to research being done for commercial benefit. I know part of the debate in the other place was about commercial versus non-commercial research. It is important that scientific research can be commercial. In fact, it must be. However, I appreciate the sentiment that their lordships were putting forward as part of the test.
I would be grateful if the Minister could address, first, some of the concerns around AI companies and, secondly, how we can capture the essence of scientific research so that when people do secondary data analysis it is done properly. When it comes to the audit component, a lot of this data analysis is people marking their own homework. Some of the data is not going through research ethics committees and being scrutinised in that sense. What assurances can the Minister give that the process will operate properly and that research that is scientific will be legitimately so once these clauses are passed?
First, the hon. Gentleman is right that there are other areas where there is a public interest test. In those other areas it is actually very narrow, however. The public interest test is currently only applied to research using special category data under schedule 1 to the Data Protection Act 2018 and public health data. It is quite specific. This is why we are nervous about extending the test across the whole of data protection. It is appropriate, we think, to have that extra protection for those particularly sensitive areas, but we think it would be disproportionate to make all researchers meet that standard, regardless of the type of data they are using.
The hon. Gentleman mentions AI companies, and the matter was of course raised in the Lords. It is worth reminding the Committee that the clause narrows rather than expands the definition of scientific research. As the provisions of the Bill make clear, web scrapers seeking to reuse personal data for purposes such as training AI models must have a lawful basis. Before even considering the meaning of scientific research, a web scraper would need to pass the balancing test to use the legitimate interest ground. The Information Commissioner’s Office outcomes report, published last year, emphasised that:
“Web scraping for generative AI training is a high-risk, invisible processing activity. Where insufficient transparency measures contribute to people being unable to exercise their rights, generative AI developers are likely to struggle to pass the balancing test.”
In other words, AI companies should not be able to do precisely what the hon. Gentleman fears they would.
The Minister says that the changes in the clause narrow the scientific research test. That certainly was not my understanding, and I would be grateful if he could clarify. Currently, under GDPR, scientific research can rely on only three purposes: statistical purposes, archiving, and genealogical-type purposes. The clause expands secondary data use under GDPR quite substantially.
If an area of research does not currently count as scientific research, it will not do so under the Bill. Researchers will benefit from having a clearer definition in legislation that reflects the language buried in the recitals and the current ICO guidance. We believe that improving clarity will reduce uncertainty and the risk of misinterpretation and misapplication of the law. The Government will of course monitor the reform’s impact and how researchers navigate the data protection framework. We think that will help us better understand whether further changes are required.
The fundamental point is that their lordships inserted a public interest line that we are seeking to take out. They inserted it because they believed it would mean that no data would be used in a way that did not lead to eventual good outcomes. The truth is that when someone starts a piece of research based on curiosity they might have no idea what the final outcome will be. That is an important part of how most researchers work, whether in a commercial or non-commercial setting. That is why the Royal Society has unambiguously stated that if we retain their lordships’ public interest provision, it will have a chilling effect on research in the UK. That is why I commend the amendment to the Committee.
The matter turns on how we define public interest. I appreciate the concerns about the clause. Being a scientific purist, I could argue that even blue-sky research could be justified as in the public interest, provided that we all benefit from the fruits of that knowledge when it is shared. Notwithstanding that point, I will go back to the Minister’s comment. I understand that the point of the clause is to expand the definition of scientific research from the currently narrow criteria under GDPR, so there will be research that is currently not scientific research that will be permitted as a consequence of the clause.
I will try one more time to persuade the hon. Member, but I will be repeating myself. The Government are not expanding the meaning of scientific research. The Bill’s definition is completely in line with present ICO guidance, which states that commercial organisations can also carry out scientific research. Scientific research conducted by commercial organisations can have a life-changing impact, as many hon. Members have noted, but the definition does not cover just any commercial activity: it includes only activities that could reasonably be viewed as scientific research.
Amendment 13 agreed to.
Clause 67, as amended, ordered to stand part of the Bill.
Clauses 68 and 69 ordered to stand part of the Bill.
Clause 70
Lawfulness of processing
I beg to move amendment 49, in clause 70, page 78, leave out lines 15 to 19.
This amendment would remove subsections (2)(b) and (c) from the Bill which would create a new lawful ground for processing personal data by way of inserting a new Article into the UK GDPR.
With this it will be convenient to discuss the following:
Amendment 53, in clause 70, page 78, line 17, after “interest” insert “, excluding personal health data”.
This amendment would exclude personal health data from being a recognised legitimate interest.
Amendment 50, in clause 70, leave out from line 24 on page 78 to line 32 on page 79.
This amendment omits subsections (4), (5) and (6), which make amendments to UK GDPR to define certain data processing activities as “recognised legitimate interests”.
Clause 70 introduces “recognised legitimate interests” as a new basis for processing personal data. Although the Liberal Democrats recognise the potential benefits of data, we have serious concerns that clause 70, as it stands, grants excessive and unchecked power to the Secretary of State, risking the erosion of fundamental rights. Our key message is that trust and innovation go hand in hand. For us to advance and share the benefits of innovation, we must foster trust.
In the name of constructive opposition, I will talk about the several amendments that we have tabled—amendments 49, 50 and 53, new clause 5 and new clause 22, which was also tabled by colleagues in the other place. These amendments are crucial to provide vital safeguards, ensure proper parliamentary oversight and demand greater transparency. They respond directly to concerns raised on Second Reading. Moreover, they echo the concerns of civil society organisations, including—
Order. The hon. Lady can only speak to amendments 49, 53 and 50 at this point.
In the interest of time, I would like to talk about all the amendments together.
You cannot talk about them all at once because of how they are grouped.
I shall talk about amendments 49, 53 and 50, which would remove the core of clause 70 that allows the direct establishment of “recognised legitimate interests”. Clause 70 risks bypassing essential parliamentary scrutiny, a point underscored by the Delegated Powers and Regulatory Reform Committee and the Constitution Committee, which expressed concerns about the lack of parliamentary oversight. The amendments are essential to ensure that the Data (Use and Access) Bill fosters innovation responsibly without sacrificing fundamental data protection. I urge the Government to look over the proposals, which would help improve the adoption and inclusion of growing technology and its benefits. I would also welcome any proposals to improve this scrutiny when we reach Report.
The new lawful ground of recognised legitimate interests in clause 70 and schedule 4 has been designed to give organisations greater confidence when processing personal data for important public interest objectives, such as preventing crime, safeguarding vulnerable individuals and protecting national security. The provisions also ensure that non-public bodies can share information with public authorities where it is necessary and proportionate to do so, without having to conduct a complex balancing test in situations where timely action is essential.
Amendment 49 would remove that provision, making it harder for organisations to share data confidently and swiftly for those public interest purposes. I recognise that the hon. Lady might be concerned that the removal of the need to do a detailed legitimate interest balancing test in this narrow set of circumstances could reduce protections, but strong safeguards remain in place, as was recognised in the House of Lords, which did not approve such an amendment. Any processing must still be necessary and proportionate and comply with data protection principles under UK GDPR to ensure that individuals’ rights continue to be protected.
Amendment 50 would remove the Secretary of State’s ability to amend the list of recognised legitimate interests using regulations, and delete the list itself from schedule 4 to the Bill. We believe that the ability of the Secretary of State to amend the list is a necessary safeguard to future-proof the framework. This power is not unrestricted. It is subject to strict safeguards, including parliamentary approval by the affirmative resolution procedure and the requirement that any additions must serve public interest objectives under article 23(1) of the UK GDPR. Removing the power would make it more difficult to adapt the framework over time.
I have also considered amendment 53, which would prevent health data being processed under the new lawful ground, and I would argue that it conflicts with amendment 49. Health data would be processed under this new ground only in so far as that was necessary for one of the specified recognised legitimate interests in schedule 4 to the Bill. Any processing of health data would also have to meet one of the relevant criteria for processing special category data in article 9 of UK GDPR and schedule 1 to the Data Protection Act 2018. On that basis, I hope that the hon. Lady will feel able to withdraw her amendments.
I rise to speak to amendment 53. I thank the Minister for his comments, and it is a pleasure to serve under your chairship again, Mr Turner. I support the words of my hon. Friend the Member for Harpenden and Berkhamsted. I am not concerned by the unchecked power of the Secretary of State under this regulation—we of course have precedents to follow under GDPR—but the specific and extremely high-grade personal health data remains a concern for me.
I beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
Question proposed, That the clause stand part of the Bill.
With this it will be convenient to discuss the following:
New clause 5—Parliamentary approval for changes to data safeguards—
“(1) Where the Secretary of State proposes to exercise any power under this Act to amend, vary or remove a safeguard relating to—
(a) recognised legitimate interests,
(b) automated decision-making,
(c) the definition of ‘special category’ data, or
(d) any data subject right provided under the UK GDPR,
the Secretary of State must lay before both Houses of Parliament draft regulations containing the proposed changes and an explanatory statement.
(2) The explanatory statement must include evidence from consultations with the Information Commission, data subjects, and relevant stakeholders, and an assessment of any impact on data adequacy with the European Union where they have occurred.
(3) The draft regulations referred to in subsection (1) are subject to affirmative procedure.”
This new clause would prevent ministerial changes being made to data safeguards without parliamentary debate and approval.
New clause 22—Statement on changes to recognised legitimate interest—
“(1) The Secretary of State must publish a statement outlining the purpose of any additions or variations to what constitutes a recognised legitimate interest if they exercise powers contained under section 70.
(2) This statement must reference—
(a) the purpose of the addition or variation,
(b) whether it is appropriate to specific data controllers, and
(c) the timeline for its relevance.”
This new clause would require the Secretary of State to publish a statement if they add or vary what constitutes a recognised legitimate interest under powers contained in section 70.
Schedule 4.
Clause 70 and schedule 4 introduce a new lawful ground for processing personal data under article 6 of UK GDPR, allowing organisations to process data for recognised legitimate interests without conducting a balancing test. The aim is to provide legal certainty and facilitate faster data sharing for key public interest purposes such as crime prevention and safeguarding.
New clause 5 sets out that any future changes to data safeguards, including those related to recognised legitimate interests, automated decision making and special category data, must be subject to greater scrutiny in Parliament. It is essential that significant changes to our data protection framework go through that democratic process. New clause 22 would require the Secretary of State to publish a statement explaining the purpose, scope and timeline of any additions or variations to recognised legitimate interests, again supporting greater transparency and accountability.
The new clauses are there to help build public trust around data and improve the adoption and inclusion of the growing range of technology and its benefits. They are for the Government to look at when thinking about how we ensure that we bring people along with us. They will help to maintain trust and ensure that we have safeguards in line with recognised legitimate interests. If we are to accept clause 70, the Government must again assure us that robust checks and balances are truly in place.
New clause 5 seeks to strengthen parliamentary oversight of regulations that could amend the recognised legitimate interest list or the provisions on automated decision making and special category data, which we will debate in detail later. The regulation-making powers in the clause are already subject to appropriate safeguards, and have been designed with retaining our EU adequacy decisions at the forefront of our mind—we will return to the topic of EU adequacy in more detail when we reach new clause 2. It is important to respect the European Commission’s processes and its discretion in how its adequacy assessment is undertaken.
New clause 22 would require the Secretary of State to publish a statement explaining the purpose of any changes to the recognised legitimate interest list, to whom they would apply and for how long. As I think I explained earlier on a previous amendment, any draft regulations laid before Parliament under these provisions will be accompanied by an explanatory memorandum, which would lay out precisely the things the hon. Lady seeks. On that basis, I hope she will not push her two new clauses to a vote.
Question put and agreed to.
Clause 70 accordingly ordered to stand part of the Bill.
Schedule 4 agreed to.
Clause 71 ordered to stand part of the Bill.
Schedule 5 agreed to.
Clauses 72 to 76 ordered to stand part of the Bill.
Clause 77
Information to be provided to data subjects
Question proposed, That the clause stand part of the Bill.
With this it will be convenient to discuss clauses 78 and 79 stand part.
Clause 77 strongly risks bypassing individual data rights by watering down transparency over the processing of personal data for reasons such as research, archiving in the public interest or statistical purposes. Although we welcome efforts to modernise data law, modernisation should not come at the cost of transparency—a cornerstone of public trust. Clause 77 risks seriously eroding rights regarding how our data is collected and processed.
Organisations such as Justice, the British Medical Association and the National Data Guardian share our concerns that this clause erodes transparency regarding the use of patient information. The National Data Guardian has specifically warned that weakening transparency obligations could negatively impact people’s trust in how our health and social care data is used for research, potentially impairing important data initiatives. Those concerns are also shared by the British Medical Association.
While we still face a crisis in the provision of health and social care, I strongly welcome all possibilities to unlock innovation to achieve healthcare benefits, as already discussed today. I welcome that innovation for my constituent Catherine in Harpenden, who was diagnosed with ovarian cancer, which is often overlooked in women; for Andy and Sarah in Redbourn, who lost their daughter at a painfully young age to a brain tumour; and for the families and others across Harpenden and Berkhamsted who are awaiting diagnosis and treatment. It is also for them, and to ensure that we unlock those healthcare benefits by ensuring that more data is used, that I caution against clause 77.
The clause introduces exemptions from providing information to data subjects where that would require “disproportionate effort”. However, it is hard to see how that can improve transparency. That failure to be transparent risks the loss of public trust—a message I will repeat again and again. I urge Ministers to listen to those concerns and to remove clause 77 to ensure that we have genuine public trust and crucial data protection standards, as already discussed today.
I am afraid I do not accept the characterisation the hon. Lady has put on the clause. There is currently an exemption from notifying data subjects if it would constitute a disproportionate effort to do so or prove impossible—obviously, there are cases where it is impossible to notify somebody. At the moment, that exemption can be used only when data was not collected directly from the data subject. Clause 77 will create a new exemption for when data was collected directly from the data subject. It will be limited to processing only for research purposes, and only when there is a change of purpose.
We believe that that will help longitudinal studies that originally obtained data from data subjects but that would struggle to notify them about a change in purpose of the study—for instance, due to having lost contact over years or due to a deterioration in the data subject’s condition. The clause is therefore essential to our research facilities in the UK, and I urge Members to support it.
Question put and agreed to.
Clause 77 accordingly ordered to stand part of the Bill.
Clauses 78 and 79 ordered to stand part of the Bill.
Clause 80
Automated decision-making
I beg to move amendment 51, in clause 80, page 95, line 19, at end insert—
“(3) To qualify as meaningful human involvement, a review must be performed by a person with the necessary competence, training, authority to alter the decision and analytical understanding of the data.”
This amendment would make clear that in the context of new Article 22A of the UK GDPR, for human involvement to be considered as meaningful a review must be carried out by a competent person.
With this it will be convenient to discuss new clause 23—Definition of meaningful human involvement in automated decision-making—
“The Secretary of State must, in conjunction with the Information Commission and within six months of the day on which this Act is passed, produce a definition of what constitutes meaningful human involvement in automated decision-making or clearly set out their reasoning as to why a definition is not required.”
This new clause would require the Secretary of State to produce a definition of meaningful human involvement in automated decision-making, in collaboration with the Information Commission, or clearly set out reasoning why this is not required.
Clause 80 raises significant concerns around the future of automated decision making. Although the Liberal Democrats recognise the potential of data to drive innovation, we must remember that behind those automated decisions are people’s lives. Amendment 51 directly addresses the ambiguity around meaningful human involvement in automated decision making and seeks to ensure that any human review is
“performed by a person with the necessary competence, training, authority to alter the decision and analytical understanding of the data.”
That would help to clarify the crucial definition of “meaningful”.
I am grateful to the hon. Lady because automated decision making and the precise definition of “meaningful human involvement” were a key issue in previous versions of the Bill, and she will know that we have changed this version from the previous one.
Our reforms make it clear that decisions
“based solely on automated processing”
are ones that lack meaningful human involvement. The terminology we have introduced in this version of the Bill goes beyond the current UK GDPR and Data Protection Act wording to prevent cursory human involvement being used to, effectively, rubber-stamp decisions that have been made by automated decision processes.
The point at which human involvement becomes meaningful is of course context-specific, which is why we have not sought to be prescriptive in the Bill. The Information Commissioner’s Office already sets out in its guidance its interpretation that meaningful human involvement must be active. Someone—a human being—must review the decision and have the discretion to alter it before it is applied. The Government’s introduction of the term “meaningful” to primary legislation does not change that definition, and we are supportive of the ICO’s guidance in this space.
As such, the Government agree on the importance of the ICO continuing to provide its views on the interpretation of terms used in the legislation. Our reforms do not remove the ICO’s ability to do that or to advise Parliament or the Government if it considers that the law needs clarification. Broadly speaking, I agree with the hon. Lady, which is why I do not think her amendment is necessary.
I thank the Minister for those points. I am happy to withdraw the amendment, but it is important to highlight that the person needs to have the necessary competence, training and authority around those decisions, so that meaningful human involvement is very clear.
Before seeking permission to withdraw the amendment, do you want to speak to new clause 23?
New clause 23 mandates that the Secretary of State, in conjunction with the Information Commissioner, must produce a clear definition of meaningful human involvement within six months or justify why a definition is not required. Again, that is to ensure that the term “meaningful human involvement” is not just a hollow gesture. I welcome the Minister’s remarks clarifying the issue, but it is crucial, if we are to get automated decision making right, that these things are defined and clear for everyone involved.
I will briefly add that the Information Commissioner already provides guidance in this sphere. We do not want to prevent the Information Commissioner from doing that in the future. The hon. Member is absolutely right: it has to be meaningful human involvement, and we think we have the balance right in the Bill.
I beg to ask leave to withdraw the amendment.
Amendment, by leave, withdrawn.
Question proposed, That the clause stand part of the Bill.
With this it will be convenient to discuss the following:
Schedule 6.
New clause 1—Requirements of public sector organisations on use of algorithmic or automated decision-making systems—
“(1) No later than the commencement of use of a relevant algorithmic or automated decision-making system, a public authority must—
(a) give notice on a public register that the decision rendered will be undertaken in whole, or in part, by an algorithmic or automated decision-making system,
(b) make arrangements for the provision of a meaningful and personalised explanation to affected individuals of how and why a decision affecting them was made, including meaningful information about the decision-making processes, and an assessment of the potential consequences of such processing for the data subject, as prescribed in regulations to be made by the Secretary of State,
(c) develop processes to—
(i) monitor the outcomes of the algorithmic or automated decision-making system to safeguard against unintentional outcomes and to verify compliance with this Act and other relevant legislation, and
(ii) validate that the data collected for, and used by, the system is relevant, accurate, up-to-date, and in accordance with the Data Protection Act 2018, and
(d) make arrangements to conduct regular audits and evaluations of algorithmic and automated decision-making systems, including the potential risks of those systems and steps to mitigate such risks, as prescribed in regulations to be made by the Secretary of State.
(2) ‘Algorithmic decision system’ or ‘automated decision system’ mean any technology that either assists or replaces the judgement of human decision-makers.
(3) Regulations under this section are subject to the affirmative resolution procedure.”
This new clause would require the public sector to provide increased transparency to service users in informing when automated decision making had been used. It would also require them to develop processes to monitor the outcomes from utilising automated decision-making.
New clause 4—Register of algorithmic tools used in public sector decision-making—
“(1) The Secretary of State must establish and maintain, or arrange for the establishment and maintenance of, a public register of all automated or semi-automated systems used by public authorities to make, or materially influence, decisions affecting the rights, entitlements or legitimate expectations of individuals.
(2) A public authority that uses or intends to use an automated or semi-automated decision-making tool must notify the Secretary of State of—
(a) the name or brief description of the tool,
(b) the decision or class of decisions in which it is used,
(c) the nature and source of the data used by the tool, and
(d) details of any meaningful human review required by law or policy.
(3) The information set out in subsection (2) must be submitted to the register prior to the deployment of any automated or semi-automated decision-making tool by a public authority.
(4) The Secretary of State must, within six months of the passage of this Act, publish guidance on compliance with this section, following consultation with the Information Commission.”
This new clause would create a publicly accessible register for AI and algorithmic tools in the public sector.
New clause 7—Information regarding high-risk AI decisions—
“(1) Where a decision based wholly or partly on automated processing, including AI or machine learning, has a legal or similarly significant effect on a data subject, that data subject has the right to request the following information from a data controller—
(a) an explanation of the reasons and criteria used by the automated processing to reach the decision,
(b) a description of the principal factors or features that most significantly influenced the outcome, and
(c) information about the process to appeal, or to request human review of, that decision.
(2) In this section, ‘legal or similarly significant effect’ includes decisions affecting an individual’s access to credit, employment, insurance, healthcare, social security, or other key public or private sector services.
(3) The Secretary of State must by regulations define the criteria and thresholds for ‘high-risk AI decisions’ to which this section applies, following consultation with the Information Commission, technical experts, civil society bodies, and such other persons as the Secretary of State considers appropriate.
(4) Regulations under subsection (3) are subject to the affirmative resolution procedure.”
This new clause would give individuals a right to obtain from data controllers an explanation of the key factors determining an AI outcome and provide a mechanism to appeal or request a human review of “high-impact” automated decisions.
New clause 24—Register of algorithmic tools used in public sector decision-making—
“(1) The Secretary of State must establish and maintain, or arrange for the establishment and maintenance of, a public register of all automated or semi-automated systems used by public authorities to make, or materially influence, decisions affecting the rights, entitlements or legitimate expectations of individuals.
(2) A public authority that uses or intends to use an automated or semi-automated decision-making tool must notify the Secretary of State of—
(a) the name or brief description of the tool;
(b) the decision or class of decisions in which it is used;
(c) the nature and source of the data used by the tool; and
(d) details of any meaningful human review required by law or policy.
(3) A public authority must not deploy an automated or semi-automated decision-making tool unless the information specified in subsection (2) has been submitted to the register.
(4) The Secretary of State must publish guidance on compliance with this section, following consultation with the Information Commission.”
This new clause would create a publicly accessible register for AI and algorithmic tools in the public sector.
New clause 26—Provision of explanations for high-risk AI decisions—
“(1) Where a decision based wholly or partly on automated processing, including AI or machine learning, has a legal or similarly significant effect on a data subject, that data subject has the right to obtain from a data controller, on request—
(a) a concise explanation of the reasons and criteria used by the automated processing to reach the decision;
(b) a description of the principal factors or features that most significantly influenced the outcome; and
(c) meaningful information about how to appeal, or request human review of, that decision.
(2) In this section, ‘legal or similarly significant effect’ includes (but is not limited to) decisions affecting an individual’s access to credit, employment, insurance, healthcare, social security, or other key public or private sector services.
(3) The Secretary of State must by regulations define the criteria and thresholds for ‘high-risk AI decisions’ to which this section applies, following consultation with the Information Commission, technical experts, civil society bodies, and such other persons as the Secretary of State considers appropriate.
(4) Regulations under subsection (3) are subject to the affirmative resolution procedure.”
This new clause would give individuals a right to obtain from data controllers an explanation of the key factors determining an AI outcome and provide a mechanism to appeal or request a human review of high-impact automated decisions.
As we have several provisions here, it might be worth me making some comments. Clause 80 strikes the right balance between helping organisations to make the most of emerging technologies that drive up economic growth and productivity, while maintaining public confidence. Organisations will be able to make decisions that have significant effects for individuals based solely on automated processes in wider circumstances than presently, but they must implement stringent safeguards.
Those safeguards include individuals’ right to challenge and obtain human intervention if they are not satisfied with the decision. Where law enforcement agencies process personal data for a law enforcement purpose, they will be able to apply an exemption to the safeguards in very limited circumstances, such as to safeguard national security. Where that happens, a human must reconsider that decision as soon as reasonably practicable after it is taken, and that review must be meaningful. For the intelligence services, where entirely automated decision making is used, we are making clarifications to provide greater confidence to controllers and the public.
Schedule 6 contains minor and consequential amendments to UK GDPR and the Data Protection Act. Those amendments repeal and replace references to the current rules on automated decision making with the reformed rules in clause 80 of the Bill to provide legal clarity.
On new clauses 1, 4 and 24, the Government’s algorithmic transparency recording standard, or ATRS, enables public authorities to publish information on how and why they are using algorithmic tools. That includes a description of the human role in the wider operational process of which the tool is part. More than 50 ATRS records are now published in the repository, with more to follow shortly. Where these amendments seek to ensure that such tools are evaluated, the blueprint for modern digital government, which was laid in Parliament in January, makes it clear that part of its role will be to offer specialist assurance support, including a service to rigorously test models and products before release. I hope this provides reassurance to the hon. Member for Harpenden and Berkhamsted.
I will address new clauses 7 and 26 together, as they intend to achieve the same effect. I would again like to reassure hon. Members—I am trying to offer a lot of reassurance to the hon. Lady and to the hon. Member for North Norfolk—that the data protection framework has stringent safeguards in place for solely automated decision making. The UK GDPR transparency obligations already require organisations to notify individuals about the existence of solely automated decision making and to provide meaningful information about the logic involved.
Under our reforms, after a decision has been made, organisations must also provide data subjects with information about that decision. These information requirements enable individuals to exercise the safeguards I mentioned earlier, and are in addition to the wider transparency requirements of the framework. The safeguards in the reformed article 22 are unnecessary for partly automated decision making, since these decisions already include meaningful human involvement by definition.
When it comes to high-risk decisions, there are already additional requirements for processing that can result in a high risk to the rights and freedoms of individuals, including automated decision making. Controllers must carry out an impact assessment for such processing activities and consult the Information Commissioner’s Office where such an assessment indicates a high risk to individuals in the absence of effective measures. That process ensures that potential risks are identified and addressed. I therefore hope that the hon. Members for Harpenden and Berkhamsted and for North Norfolk feel that they do not need to press their new clauses to a vote.
This is a really interesting set of clauses. The debate that we are having, and that they had in the other place, on automated decision making is particularly interesting.
Of course, automated decision making is here. It is everywhere in every part of our lives. I was just looking at the Spotify app on my phone, and my daylist suggested that I start off by listening to Front Line Assembly, which is an industrial metal band. It is probably because, going into the data Bill Committee on a Tuesday morning, I need something to get me started.
Don’t get me started.
Ah, the temptation.
The point about automated decision making is that decisions can be made about people without their involvement, which can have substantial consequences, particularly when decisions are made in the public sector. Of course, automated decision making takes place all the time, day in and day out. We saw it on a huge scale during the covid pandemic when people had to sign up to universal credit, as there were checks on people’s basic characteristics and assets to assess their eligibility. Those decisions had quite a substantial impact on people, but the data points were actually quite simple and straightforward. As I understand it, no particular concern was raised about automated decision making in that context, apart from a few examples where people’s circumstances were highly unusual.
The purpose of the Bill is to scale this up, particularly when it comes to AI and more sophisticated decision making. We think the new clauses seek to place onerous and unnecessary obligations on Government bodies and public authorities. New clause 1, among other things, would require public bodies to give notice on a public register for each and every function they perform where automated decision making is used. New clause 4 would oblige the Secretary of State to put in place and maintain a public register of all semi-automated or fully automated decision-making tools used by public authorities in relation to individuals, with public bodies being precluded from using such tools in advance of their registration.
New clause 7 would provide service users with a range of powers to request information from data controllers in relation to the workings of wholly or partly automated decisions, and to purportedly high-risk decisions. I am amazed by the provisions suggested in new clauses 7 and 26 by the hon. Members for Harpenden and Berkhamsted and for North Norfolk. They seem to give quite extensive discretionary powers to the Secretary of State to define high-risk decisions and regulate them accordingly. I was surprised to see the extent of the powers handed over to the Secretary of State by these new clauses.
The new clauses are especially burdensome and unnecessary, given the widespread use of automated decision making for initial assessments in the Department for Work and Pensions, as I alluded to earlier. Automated decision making is a subject of significant debate, and clause 80 includes more safeguarding measures in proposed new article 22C of the UK GDPR. Those safeguards include requirements on data controllers to provide information to data subjects about significant decisions being taken through solely automated processing, the right to contest those decisions and the right to seek human intervention at the request of the data subject.
Our view is that clause 80, as drafted, provides a proportionate approach to the protection of individual rights, and that these tools will help with potential speed and, importantly, efficiency gains, which I mentioned earlier with regard to the use of automated decision making for universal credit in the response to covid. The basic functions of Government and public authorities rely on widespread automated decision making.
I like to think that the Liberal Democrats are aligned on automated decision making and AI being an exciting opportunity for this country, but it is a fundamental shift. It is our job as a constructive Opposition to put forward proposals for consideration, so that there is proper scrutiny and ideas to ensure we take the public with us. All our proposals are made in that light.
Behind these decisions are real people’s lives. I have a family in Wheathampstead who are desperate for a social home to fit their family life. I have someone from Tring—I will call him John—who says he has been systematically failed since high school. His universal credit claim was closed because he did not upload the correct documents, and he feels that a lack of human understanding of why he was unable to do so was a barrier to getting the help he needed. Another case involves HMRC, with a constituent being taxed on pension income he is not actually receiving—it was only after hours of support from my team and the Department that we were able to turn his case around.
The point is that it will be very difficult for people to fix errors that result from automated decision making if there is not enough transparency. We have to get that balance right. The Secretary of State’s power to define crucial terms such as “meaningful human involvement” and “similarly significant effect” creates uncertainty and the potential for watering down protections.
Clause 80 seeks to update the safeguards around automated decision making, but organisations such as Justice and the Open Rights Group have real worries that it weakens those vital safeguards by broadening the scope for purely automated decisions. Furthermore, the considerable powers granted to the Secretary of State to amend or set aside the safeguards through secondary legislation create a concerning lack of parliamentary oversight.
It is with these reservations in mind, and as a constructive Opposition, that we tabled these new clauses. New clause 1 would place a clear requirement on public sector organisations regarding the use of algorithmic automated decision making. It would demand public notice of their use and the provision of meaningful and personalised explanations to affected individuals about how and why a decision was made. I have already raised some cases that the provision would impact, and I am sure we all have similar cases in our inboxes. New clause 1 directly tackles the concerns about transparency and the right to an explanation—a principle that the Liberal Democrats firmly believe in.
New clause 4 proposes the creation of a publicly accessible register of AI and algorithmic tools in the public sector. New clause 7 seeks to empower individuals by granting them the right to request comprehensive information on decisions based wholly or partly on automated processes that have a legal or similarly significant effect on them. This aligns with the Liberal Democrats’ commitment to ensuring accountability and redress.
New clause 24 proposes the establishment of a public register of all automated or semi-automated systems used by public authorities to make or materially influence decisions. These measures would help to promote transparency and enable public scrutiny, which is something that this House and its Members are here to do, either through legislation or for our constituents on a daily basis. It is important that we ask questions about these increasingly influential technologies.
Finally, new clause 26 would reinforce the right to an explanation by requiring data controllers to provide, on request, clear explanations of high-risk AI decisions. We need to ensure that meaningful human involvement remains central.
The new clauses are not intended to stifle innovation, but to ensure that the evolution of data use is guided by principles of fairness, transparency and accountability. We must ensure that data truly is the new gold for everyone, and that individuals’ rights and protections are not devalued in the process. Indeed, innovation is often better promoted by guidelines. The new clauses represent our commitment to ensuring that clause 80 truly modernises our data laws without sacrificing the fundamental safeguards that protect everyone, including the people behind those decisions. I thank the Minister for his consideration.
I am not sure that I have a great deal to add. I feel caught between the rocks of “You are going too far” and “You are not going far enough,” which suggests to me that what we are advocating is probably proportionate and sensible, so I resist the new clauses and support clause 80.
I simply add that I do not have Spotify on my phone because I choose not to stream—I prefer to pay for my music—but I note that my ministerial podcast is available on Spotify.
Question put and agreed to.
Clause 80 accordingly ordered to stand part of the Bill.
Schedule 6 agreed to.
Clause 81
Data protection by design: children’s higher protection matters
Question proposed, That the clause stand part of the Bill.
With this it will be convenient to consider new clause 21—Age of consent for social media data processing—
“(1) The UK GDPR is amended as follows.
(2) In Article 8 of the UK GDPR (Conditions applicable to child’s consent in relation to information society services), after paragraph 1 insert—
‘(1A) References to 13 years old in paragraph 1 shall be read as 16 years old in the case of social networking services processing personal data for the purpose of delivering personalised content, including targeted advertising and algorithmically curated recommendations.
(1B) For the purposes of paragraph 1A “social networking services” means any online service that—
(a) allows users to create profiles and interact publicly or privately with other users, and
(b) facilitates the sharing of user-generated content, including text, images, or videos, with a wider audience.
(1C) Paragraph 1B does not apply to—
(a) Educational platforms and learning management systems provided in recognised educational settings, where personal data processing is solely for educational purposes.
(b) Health and well-being services, including NHS digital services, mental health support applications, and crisis helplines, where personal data processing is necessary for the provision of care and support’”.
This new clause would raise the age for processing personal data in the case of social networking services from 13 to 16.
I think the hon. Member for Runnymede and Weybridge thinks I was trying to calumniate him earlier by suggesting that he does not pay for his music, but I was not making that suggestion at all.
Clause 81 amends article 25 of the UK GDPR to strengthen the obligations for providers of information society services likely to be accessed by children, such as social media platforms and streaming services, by requiring them to actively consider the protection of children’s data when designing their services, and to ensure that appropriate organisational and technical measures are in place to safeguard young users.
The clause sets out the higher protection matters that ISS providers must consider when designing their services, including how best to protect and support children when processing their personal data. The duty also requires data controllers to consider the fact that children merit specific protection because they may be less aware of the risks and their rights, and the fact that children’s needs vary with age and development. When considering how to comply with the new duty, ISS providers will be greatly aided by the age-appropriate design code issued by the Information Commissioner’s Office, which sets out comprehensive guidance for ISS providers on using personal data in a way that complies with data protection principles and respects the best interests of children.
New clause 21, in the name of the hon. Member for Harpenden and Berkhamsted, would in effect increase the age of consent for social media from 13 to 16. I am happy to listen to her comments, but much of this was debated in Friday morning’s private Member’s Bill debate—I think that is partly what led her to table the new clause—and in the Westminster Hall debate a few weeks ago. I am happy to respond to her comments, but we will be resisting her new clause.
At the heart of new clause 21 is the aim to restrict access to certain social media platforms by children under the age of 16. I am very sympathetic to the motivations of the hon. Member for Harpenden and Berkhamsted.
I just want to clarify that the new clause is about the age of data consent, not access to social media. I will cover that in my speech, but I want to ensure that is clear.
I thank the hon. Lady for that clarification. Nevertheless, I think there is an overlap with the debate that took place in the Chamber last Friday on the ballot Bill introduced by the hon. Member for Whitehaven and Workington (Josh MacAlister), which has been dubbed the safer phones Bill. We all took part in that debate.
During Friday’s debate, I highlighted the mounting research and evidence on the harm to adolescents’ mental health from social media platforms. Policy in this area needs to be developed based on the best data and evidence. In that regard, we also need to learn from the roll-out of the Online Safety Act 2023 and the implementation of Ofcom’s guidance on putting in place age assurance in relation to particular types of content and data.
I welcome reports that the Government intend to commission the chief medical officer to review the harms associated with this area. It is disappointing that the Minister did not give a firm commitment on that from the Dispatch Box last Friday. Will he take the opportunity to do so today?
I start with the young people at the heart of this. We have indeed debated this time and again, and it is important that we keep that going. I have been on a Safe Screens tour, speaking to young people across Harpenden and Berkhamsted—from Ashlyns school in Berkhamsted to St George’s school in Harpenden, and beyond. Girl Guides have spoken to me of their concerns about online bullying and harmful content. When I visited local schools on my Safe Screens tour, young men told me that social media has shown them extreme content that they do not want to see, and despite their efforts to block or ban it, the algorithm brings it back. Young women have told me of their worries about the growing misogyny in online content. Together, young women and men alike have talked about the impact on their body image, bullying and the amount of time sucked up by social media—time they want to spend doing other things.
There is mounting evidence of the negative effects of our children’s exposure to an unsafe online environment. Every day, we hear the concerns of parents, the anxieties of teachers and, crucially, the voices of young people themselves, all crying out for action to ensure their safety online and calling for stronger regulation of their online experience. I realise that the Online Safety Act is moving forward, but it will not go far enough.
This is a matter of profound importance for the wellbeing of our nation’s children and young people. The digital world is now the fabric of our lives and the lives of young people—it is where they learn, connect and seek entertainment—and yet, as the digital landscape expands, so do the risks they face. As 5Rights highlights,
“the enormous potential of technology can only be realised when it is designed with children in mind”.
Parliament now has an opportunity, perhaps even an obligation, to help shape that technology. It is disappointing that the safer phones Bill was watered down, so we have tabled new clause 21 as a crucial step to strengthen protections for young people in the digital age. It seeks to bring UK data law into line with that of many of our European neighbours by raising to 16 the minimum age at which internet companies can collect, process and store a user's data without explicit parental consent.
New clause 21 is not about banning children under the age of 16 from accessing social media. It is important that the responsibility lies with the social media companies to ensure that online spaces are age-appropriate, and to manage content and features. This new clause is a targeted and proportionate measure to require online services to change fundamentally how they handle children’s data. This crucial change will necessitate restricting the pervasive and often harmful influence of algorithms to make the platforms inherently less addictive and ultimately foster a more child-friendly digital environment.
It is imperative that digital technology is designed with the best interests of our children and young people at heart, ensuring that their rights and privacy are upheld—not as an afterthought, but with safety by design and by default. Furthermore, new clause 21 underscores the fundamental principle that the responsibility for managing content and features must remain firmly with the social media companies. We call for the establishment of robust digital standards to ensure that platforms prioritise child safety and privacy as their default setting, rather than as an optional extra.
Our duty to protect children online, however, cannot end with new clause 21. We call on the Government to take further action, such as explicitly recognising and treating children’s use of social media as a critical public health issue, thoroughly examining international best practice, identifying innovative solutions to the challenges we face, and developing evidence-based policies that will demonstrably improve online safety for children. I welcome the Government’s proposal last week to introduce a cross-Government safer screens taskforce to look at research into the multifaceted impact of social media on children’s wellbeing.
This is not an overnight phenomenon. Last year marked 20 years since Facebook was created and 18 years since the smartphone was launched. We now face mounting calls from young people, parents and teachers. Now is the time for greater action on the gaps left by the Online Safety Act. I urge the Government to recognise the urgent need for decisive action and to support new clause 21. This is a pivotal opportunity to strengthen protections for young people online and to cultivate a digital world where they can thrive, not merely navigate potential harms. Let us work together across party lines to ensure a safer and brighter digital future for all our children.
It is a pleasure to speak under your chairship, Mr Turner.
I have some sympathy with what the hon. Lady is trying to do with new clause 21. I invite the Minister to address, as I am sure he will, her very specific point that it is not about restricting access to social media for people aged between 13 and 16, but about further restricting how social media is targeted at those age groups. After visiting schools in my constituency, I am minded to support her argument. At Ryde academy, I heard the reaction of young people to their school restricting access to their phones, which a school can only do during the school day.
I am interested to hear the Minister's response to the quite reasonable proposition on how to restrict social media companies from targeting children aged 13 to 16. Restricting the way those companies collect children's data sounds like a small but meaningful step that the Government could take.
I thought the debate might stray a little from the precise point my hon. Friend just made into broader issues about young people's use of smartphones and social media in general, and inevitably it has. This is a very live issue, of which the Government are painfully aware, as I tried to say in the debate last Friday. We are trying to work out the best way forward to a place where the rules implemented in the Online Safety Act—some of which only come into force in the spring, and some of which have yet to get parliamentary approval, which will probably come in April—can bed in properly and be adhered to before we take further action. That is simply because it is difficult otherwise to know whether the Online Safety Act has gone far enough. I wish that it had been not the Online Safety Act 2023, but the Online Safety Act 2021 or 2019. Had it been, we might now be in a position to assess its effects.
In the debate on Friday I referred to the feasibility study commissioned by the Department for Science, Innovation and Technology from the University of Cambridge, which is working with other researchers. Work on that will finish in May, so I hope to have further information then. There is already a lot of guidance out there for parents, teachers and schools, but if we need to provide more informed guidance in the future, we will certainly look into that.
The precise targeting of young people by algorithms and so on is, at least theoretically, already dealt with by the Online Safety Act. That is why I do not think that new clause 21 is an appropriate measure to take forward at this time. Whether the age limit should be 13 or 16 was decided by the previous Government and agreed by Parliament after quite extensive consultation during the passage of the Data Protection Act 2018. All these matters have to be kept under review, and we are doing so in an urgent way, not least because a lot of Members are bringing up concerns raised by their constituents, families and so on.
I sometimes worry that the concentration solely on schools is to the detriment of the wider issues we are facing, not least because, as teachers readily admit, they only ever see children for about 20% of their time, and we need to set this in a wider context. I understand the motivation behind the new clause, but I will still resist it.
Question put and agreed to.
Clause 81 accordingly ordered to stand part of the Bill.
Clauses 82 to 85 ordered to stand part of the Bill.
Schedules 7 to 9 agreed to.
Clauses 86 to 88 ordered to stand part of the Bill.
Clause 89
Joint processing by intelligence services and competent authorities
Question proposed, That the clause stand part of the Bill.
With this it will be convenient to discuss the following:
Clause 90 stand part.
New clause 8—Intelligence services and law enforcement data-sharing: enhanced oversight—
“(1) Where intelligence services or law enforcement bodies share personal data under this Act, the Commissioner must review the nature, purposes, and scope of the data-sharing annually and produce a report assessing—
(a) the necessity and proportionality of the data-sharing,
(b) any impact on individuals’ rights, and
(c) any recommendations for improved safeguards or remedial measures.
(2) The Commissioner must lay the report under subsection (1) before both Houses of Parliament, subject to any redactions strictly necessary on national security grounds.
(3) This section does not authorise or require the Commissioner to disclose information that could jeopardise national security, but the Commissioner must endeavour to provide the maximum transparency compatible with security requirements.
(4) The Secretary of State must respond to each report within 90 days of their laying before Parliament, addressing any recommendations made by the Commissioner.”
This new clause would require the Information Commission to review, annually, data-sharing by intelligence and law enforcement services, and to publish a report of its findings.
I rise mainly to speak to new clause 8, which was tabled by the hon. Member for Harpenden and Berkhamsted. It would require the Information Commissioner to review and publish an annual report on data sharing by law enforcement and the intelligence services. Any processing of personal data by law enforcement and the intelligence services must, by definition, be lawful, which inherently includes the requirement that the processing be necessary and proportionate.
The Information Commissioner is obliged to provide an annual report to Parliament under the Data Protection Act 2018, and may also produce additional reports as they see fit. The commissioner has powers of investigation, monitoring and enforcement that they may exercise as appropriate to oversee law enforcement agencies and intelligence services. Given the current system of oversight, and the fact that the commissioner is already required to provide reports annually, I hope the hon. Member for Harpenden and Berkhamsted will not press her new clause to a vote.
I am sure members of the Committee will be pleased to hear that I have a very short speech. Clause 89 permits the processing of personal data jointly between law enforcement and intelligence services for the crucial purpose of safeguarding national security. While the Liberal Democrats recognise the necessity of such collaboration in specific circumstances, the inherently sensitive nature of the data demands the highest level of protection and scrutiny. New clause 8 is designed to ensure that ministerial changes to the data safeguards on this joint processing cannot be enacted without proper scrutiny by Parliament. We would also welcome the Government bringing forward proposals on Report to uphold and ensure that parliamentary scrutiny.
Question put and agreed to.
Clause 89 accordingly ordered to stand part of the Bill.
Clauses 90 and 91 ordered to stand part of the Bill.
Clause 92
Codes of practice for the processing of personal data
Question proposed, That the clause stand part of the Bill.
With this it will be convenient to discuss the following:
New clause 27—Secretary of State’s powers in relation to the Information Commission—
“(1) Prior to issuing any guidance or statement of strategic priorities to the Information Commission, the Secretary of State must consult—
(a) the Science, Innovation and Technology Committee of the House of Commons, and
(b) the Senedd, Scottish Parliament, or Northern Ireland Assembly where guidance relates to devolved powers.
(2) The Secretary of State may not reject or change a Code of Practice prepared by the Information Commission without a resolution to do so from each House of Parliament.”
This new clause seeks to strengthen the regulator’s independence by limiting ministerial powers to direct or override the Information Commission without oversight.
Clause 93 stand part.
I rise to speak primarily to new clause 27. The Government are committed to the independence of the regulator, which will continue to be accountable to Parliament. The Bill does not introduce a statement of strategic priorities, as the hon. Member for Harpenden and Berkhamsted seems to suggest in her new clause, nor does the Secretary of State issue discretionary guidance to the regulator. Neither the Bill nor existing legislation provides for the Secretary of State to amend or reject a code of practice. I therefore do not see that this new clause is necessary.
I have a specific point on the text of new clause 27. We do not support this curtailment of powers, particularly in this context. I am sure that the Secretary of State will consult as a matter of course in carrying out his duties, whatever they are, and a specific requirement to consult the Science, Innovation and Technology Committee, notwithstanding its prowess and stature, would be inoperable.
For us, the amendment is again in the spirit of ensuring proper scrutiny. However, I welcome the Minister’s comments and am happy not to press it.
Question put and agreed to.
Clause 92 accordingly ordered to stand part of the Bill.
Clauses 93 and 94 ordered to stand part of the Bill.
Clause 95
Regulations under this Part: Parliamentary procedure and consultation
I beg to move amendment 22, in clause 95, page 120, line 31, leave out subsection (1).
This amendment removes a subsection which was inserted at Report stage in the Lords.
With this it will be convenient to discuss the following:
Clause stand part.
Clauses 135 to 139 stand part.
New clause 17—Statement on application of the Copyright, Designs and Patents Act 1988 to activities by web-crawlers or artificial intelligence models—
“The Secretary of State must, within three months of Royal Assent, issue a statement, by way of a copyright notice issued by the Intellectual Property Office or otherwise, in relation to the application of the Copyright, Designs and Patents Act 1988 to activities conducted by webcrawlers or artificial intelligence models which may infringe the copyright attaching to creative works.”
New clause 18—Report on regulation of web-crawlers and artificial intelligence models on use of creative content—
“The Secretary of State must, within three months of Royal Assent, lay before Parliament a report which includes a plan to help ensure proportionate and effective measures for transparency in the use of copyright materials in training, refining, tuning and generative activities in AI.”
New clause 19—Report on reducing barriers to market entry for start-ups and smaller AI enterprises on use of and access to data—
“The Secretary of State must, within three months of Royal Assent, lay before Parliament a report which includes a plan to reduce barriers to market entry for start-ups and smaller AI enterprises on use of and access to data.”
New clause 20—Publication of a technological standard—
“The Secretary of State must, within 12 months of Royal Assent, publish a technological standard for a machine-readable digital watermark for the purposes of identifying licensed content and relevant information associated with the licence.”
Clause 95(1) relates to artificial intelligence and copyright and the related reporting requirement on the Information Commissioner. I will set out our fuller reasoning on copyright shortly, but for now, I merely state that we believe that now is not the right time to place an additional reporting obligation on the Information Commissioner in the Bill. Government amendment 22 would therefore remove subsection (1), which was added to the Bill on Report in the Lords.
Subsection (2) requires the Information Commissioner to publish key performance indicators. The Government take the performance and accountability of UK regulators seriously. Regulators undertake important functions across our economy and society, and the Information Commissioner’s Office is no exception, so it is important that Parliament and other stakeholders have the right means to hold them to account. Publishing such metrics is best practice and a transparent way of providing measurable analysis of the ICO’s performance year on year. This forms part of a package of additional reporting requirements, alongside those in clause 91 and the report on regulatory action in clause 102, which we will discuss later. Together, they will increase understanding and transparency about priorities, performance and enforcement activities.
Clauses 135 to 139 are in this group, although we will vote on them later. They set out further requirements regarding copyright and AI, and I urge that they do not stand part of the Bill. These clauses require regulations to be made to ensure compliance with copyright law by web crawlers and general AI models, and transparency about the use of web crawlers and the works they scrape. There is also an enforcement requirement for the Information Commissioner, and provision on a review of technical solutions.
The Committee well knows, because I have mentioned it several times, that I agree with many of the points raised in debate on the clauses and the importance of transparency for rights holders in the creative industries. It was one of the key principles in the consultation we published on copyright and AI that closed on 25 February. We want genuine transparency about what is used in training AI, alongside rights holders’ control of their work and appropriate access to training material for AI. However, although I accept the intention behind the amendments—that is, clauses 135 to 139, which were added in the Lords—we do not believe that the Bill, which is a data measure, is the right vehicle for action.
We received more than 11,000 detailed and heartfelt responses to our public consultation on AI and copyright. Many set out specific views on transparency, technical standards and a range of the questions that we asked in the consultation. We are taking care to read each response. Although we believe that action needs to be taken on transparency and web crawlers, as well as other issues relating to AI and copyright, it is only right that we carefully consider all the viewpoints and evidence before acting. We have heard loud and clear the message that stakeholders do not want us to rush to legislate on this topic, and we intend to heed this message. Yesterday the Secretary of State met representatives of the creative industries and repeated that point.
The engagement will not end with this consultation. I have already stated our intention to create working groups to move the conversation forward, including on technical solutions. Industry often comes up with the best ideas, so I want to harness that, whether it is greater transparency about AI training or standards on web crawlers, metadata and watermarking. Whatever the solution, we want to be confident in its efficacy and, critically, in its simplicity and accessibility. We have said repeatedly that we will not move forward in this sphere unless we are confident we can give rights holders greater control over the use of their works. Once we have analysed the responses to the consultation, we will publish proposals. Clauses 135 to 139, which were added to the Bill on Report in the House of Lords, should not stand part of the Bill.
Opposition new clauses 17 to 20 also relate to AI and copyright. I am grateful for the way in which the shadow Minister, the hon. Member for Runnymede and Weybridge, has advanced these suggestions. These new clauses would require the Secretary of State to issue a statement on the application of UK copyright law to the activities of web crawlers and artificial intelligence models. One of the new clauses sets out the possibility of this being a copyright notice, issued by the Intellectual Property Office. They also require the Secretary of State to produce two reports, one of which should include a plan for transparency regarding the use of copyright materials with AI; the other should be on access to data and how to reduce market entry barriers for start-ups and smaller AI enterprises. Finally, they would require the Secretary of State to publish a technical standard for a machine-readable digital watermark, which would allow rights holders to label their content for the purposes of licensing.
As the House is aware, these amendments relate to matters that are at the heart of the Government's approach to AI and copyright. Indeed, the recently closed consultation sought views precisely on these issues. I commend my Conservative colleagues on the spirit of these amendments, but ask that they not be pressed at this time. I hope we can have further discussions on how we move forward on Report. While the amendments may indeed offer useful steps for the Government to take in solving issues of AI and copyright, proper analysis and policy development are likely to take longer than the timeframes proposed in the clauses, given the 11,000 responses we received to the consultation. In addition, we would not wish to pre-empt the consultation, which some of these new clauses may do.
That said, I hope that I can discuss the issues raised by the amendments further with Members from both Opposition parties, to see whether we might be able to agree on similar proposals as we move forward. As I say, we take these issues seriously. As I am the Minister for Creative Industries and a Minister in the Department for Science, Innovation and Technology, I take a special interest in this and agree with the spirit of what these amendments seek to achieve, but I ask that new clauses 17 to 20 not be pressed at this stage.
Equally, I thank the Minister for the spirit in which he has conducted this debate. It is an important one, which has captured the focus of a great many people, not only those in the creative and AI sectors whose livelihoods absolutely depend on us getting this right, but all of us who love and benefit from the products of the UK creative industries.
I will discuss clauses 135 to 139, which the Government are seeking to remove, before I speak to the new clauses that stand in my name and those of other Opposition Members. While there are problems with clauses 135 to 139, they do reflect the creative industries' well-founded and much-publicised concerns about the manner in which AI developers conduct data mining activities to train their models and for generative activities. I should declare an interest: I recently attended the Brit awards with my wife, with support from the British Phonographic Industry, so I received hospitality in the last couple of weeks.
I want to recognise the tremendous efforts of the noble Baroness Kidron in the other place in bringing this incredibly important issue to the forefront of the minds of lawmakers and the public. My right hon. Friend the Leader of His Majesty's Opposition has been clear about the importance of growing our domestic AI industry. These technologies have the potential to improve our lives and stimulate economic growth. That need not and must not come at the expense of our creative industries. Music, especially live music, is one of my passions. I know that the Minister shares my passion for our creative industries; indeed, his efforts to incorporate a thespian flourish into the debates we have had over the past few months do not go unappreciated—by some.
And are hated by others.
The Minister can interpret that how he wishes. We must ensure that our creative industries are supported to thrive while harnessing the technological and economic benefits of growing our domestic AI capability. The amendments in this area from the other place touch broadly on a number of important principles, which we support. They include the application of copyright law to data mining and AI-generative activity, which is covered in clause 135; the need for a high degree of transparency in how AI models are trained and the materials used for that purpose; and the importance of identifying sound technical solutions so that copyrighted content can be identified by web crawlers and AI models, and excluded from training and generative activity.
However, these matters and their solutions are complex from technological, legal and societal perspectives. We totally recognise the challenges of legislating in this area. Our international counterparts face the same challenges in trying to strike an effective balance between supporting our AI industries and supporting our creative sector. It is not a zero-sum game. There is clear scope for collaboration and mutual benefit if we get this approach right. As the shadow Secretary of State for Science, Innovation and Technology, my hon. Friend the Member for Havant (Alan Mak), stated in the Chamber, creative industries are telling us that the Government’s solution—set out in their consultation as their preferred option—is “not fit for purpose”. We need to engage carefully with the feedback received from industry stakeholders to come up with the right solutions in this area.
As the Minister mentioned, new clauses 17 to 20 recognise that putting together a specific, detailed plan to solve this problem needs to be done by chewing through the responses to the recently closed consultation and through extensive engagement with the people who are actually going to do this stuff and the people who will be affected by it. In effect, the proposals are end-point clauses, about the world we want to see in the future, where our AI and creative sectors can work together and we get all the benefits of both. It is where we want to get to. It is a bit less about putting together the journey plan of how we get there, with the exception of the fact that the Government need to crack on and sort this out—[Interruption.] Well, he's the Minister. It comes with the job.
New clause 17 is intended to require the Government to confirm the application of existing copyright law to creative content mined by AI web crawlers and models in the same way that it would apply to other offline content. The Opposition feel that there is no ambiguity about the application of copyright law in this context. However, the Government have consistently suggested, in their consultation on copyright and artificial intelligence and elsewhere, that the legal position is unclear. In effect, that has created ambiguity and given rise to significant concern and objection from the creative sector. It deters smaller AI enterprises and start-ups from developing their products.
Running counter to that is the mere existence of a possible future opt-out. I have heard evidence that AI companies are now thinking, "Let's hang on before we buy licensed content, because we might get it for free anyway if this applies going forward." If the aim of the consultation was to give confidence to the AI sector, that has not happened either.
Clarity is needed for both the creative and AI industries, but not in the form of a wholesale exemption from copyright law for development and generative activity undertaken by AI models. The Government should make a clarificatory statement about the existing application of copyright law in this area. That certainty is the foundation block of sound policy aimed at supporting the creative industries to harness the economic value of their work, so I commend new clause 17 to the Committee.
New clause 18 requires the Government, within three months of Royal Assent, to lay before Parliament a plan to put forward proportionate and effective measures to ensure transparency in the use of copyright materials in the training, development and generative activities of AI models, with the emphasis on the proportionate and effective nature of those measures. At a high level, transparency in this area will help to ensure that information is available for rights holders to discern where and how their work has been used, and to seek payment for that use where appropriate.
New clause 18 seeks to address the widespread concerns of creative rights holders that they do not know when their material is being used, which acts as a block to any form of enforcement of their rights or licensing going forward. However, the approach to transparency must be proportionate. In particular, transparency obligations should not be so onerous that they stifle the market for AI start-ups and smaller enterprises in their infancy, when these entities should form the lifeblood of our future technological economy. One could envisage a situation where, if we get the legislation on this wrong, the recording and storing of information about what has gone into a model could, taken to a trivial extreme, in effect become a bigger dataset than the one used to train the model in the first place—hence the focus on the importance of proportionate and effective measures.
The timing of the action to be taken by the Secretary of State under the new clause—within three months of Royal Assent—is intended to give the Secretary of State a proper opportunity to consider the feedback of the creative and AI industries and other stakeholders received under the recently concluded consultation on copyright and AI. I am ambitious for the Minister and the Government. I know he slightly deprecated his ability to get on and try to put something together, but I have faith in him to drive this forward. The feedback should help to formulate a plan that is both effective and workable, so I commend new clause 18 to the Committee.
New clause 19 requires the Secretary of State, within three months of Royal Assent, to lay before Parliament a plan to reduce barriers to market entry for start-ups and smaller AI enterprises, specifically in relation to access to data to train their models. The Government's now-concluded consultation on copyright and artificial intelligence specifies that one of its key aims is to support wide access to high-quality material to drive the development of leading AI models in the UK.
The AI opportunities action plan commissioned by the Government made a series of recommendations relating to the need to unlock public and private datasets to enable innovation by UK start-ups and researchers in order to attract international talent and capital. Although an imbalance in access to resources between established companies on the one hand and start-ups and small and medium-sized enterprises on the other is in the nature of markets, failing to address it in this context risks reinforcing a situation in which AI development remains the preserve of a relatively small number of established operators. I am thinking particularly of those that have been able to set up now, using datasets that have not had to comply with whatever provisions will eventually be put in place to regulate this area.
The Prime Minister indicated in his response to the action plan that DSIT would look at how to take this policy area forward. New clause 19 calls on the Government to put their money where their mouth is and come up with a proper plan to turbocharge not only growth in our AI sector but also diversity and competition among operators of all sizes, by reducing the barriers to market entry that the accessibility of data presents.
Given the increasing adoption and use of AI by digital Government and the mechanisms of the state, it is critical for our national security and our national resilience to have a functioning domestic AI market that we can draw on for our use. The timescale for the Secretary of State to publish his report provides an opportunity for industry and stakeholder feedback to be taken into account when developing effective plans. For those reasons, I commend new clause 19 to the Committee.
Finally, new clause 20 requires the publication of a technological standard. I hope that the previous three new clauses are relatively uncontroversial, and I particularly hope that this new clause is the most uncontroversial of all. It is based on an amendment that was moved by my noble Friend Viscount Camrose in the other place, but which was withdrawn on receipt of reassurances from the Minister. The new clause requires the Government to publish a technological standard for machine-readable watermarks for the purpose of identifying the licensed content when it is encountered by web crawlers and AI models.
The development of such a standard would greatly enhance the ability of rights holders to protect their work, and would support the enforcement of creative rights by creating a record of where and how content has been used. A technological standard will be incredibly useful even if the Government decide to take forward the flawed opt-out proposals, and even if those proposals are not ultimately adopted. It would be used in whatever situation we find ourselves in over the next few years in this industry.
This proposal would provide web crawlers and AI models with clear signposting that the content is not available for training or generative activity. Where that is properly deployed and observed by AI models, it will reduce the need for rights holders to take action to seek compensation for infringement of their copyright. The new clause does not ask the Government to commit to a specific solution, and nor should it, but it requires them to commit to identifying such a solution within a reasonable period. Let the industry sort it out. One of the challenges that has been going through my mind when exploring this area is why, from a point of principle, the market has not fixed this in the first place, and why we are where we are now in trying to resolve these issues. The new clause will promote transparency and certainty for the creative and AI industries, and I commend it to the Committee.
I have found these discussions among Members across the House about AI and copyright to be a delightful exchange about the creative industries that we love. I understand the Minister’s comments about there being different spaces for that debate and that this may not be the place for that, but I would like to speak to the new clauses that were proposed in the other House.
To highlight one of the creative industries that I love, this weekend I went to The Rex cinema in Berkhamsted, one of the best cinemas in the world—if any Members want to come, they are all very welcome. Sitting there, watching "A Complete Unknown"—about Bob Dylan and his songs—in a local movie theatre reminded me just how important it is to ensure that the creative industries can maintain their creativity, because it speaks to our soul and our society. They are beyond machine learning. They are humans who actually understand us and help to tell our stories in different ways.
That is why the Liberal Democrats welcomed clauses 135 to 139, which were tabled as new clauses in the other place. We also welcome the consultation from the Government and are pleased to hear that there have been over 11,000 responses. Members will find responses from us in there, because we wanted to make sure that the Liberal Democrats made representations at every stage. It is disappointing that the change that the Government have set out means there is an "opt in by default" provision, and the message has come across from the creative industries that they are calling for an open consultation, so that we can make sure that their views are heard. We support their voices being heard and, indeed, striking the balance whereby the creative industries can work with innovation, as they have been. This is about getting that right. We are proud of those industries in the UK. We must work together and truly listen to those thousands of voices up and down this country. That is what we stand for, and we will support the Government moving forward in that way.
First of all, I commend both hon. Members on their contributions. Let us face it; this is very difficult. It is a classic case of difficulty for the Government because, as the previous Government found, two sectors are really important to UK growth. The creative industries are one of the fastest-growing sectors in the UK and are representative of our future growth. Somebody put it to me the other day that there was a time when our economy relied on selling industrial products, then it was services, and, in the new era, perhaps one of the things that we are exceptionally good at is selling experiences, and a large part of that is through intellectual property from the creative industries.
As part of our industrial strategy, we have identified the creative industries as one of our key sectors and we definitely want to build on that. But, at the same time, we are one of the biggest markets in the world for AI. We have some of the best AI small start-ups and developers, and we want to build on that, not just as a key part of our economic growth, but as part of delivering productivity and efficiency within so much of our public services, within industry, and within so much of the way we live our lives.
As the hon. Member for Runnymede and Weybridge says, we do not believe that this is a zero-sum game; we believe that it is possible to get to a situation in which the AI companies can flourish in the UK, and the creative industries can do so at the same time. Indeed, I would argue that to say anything other than that would be, in the words of the Sugababes,
“a one-way ticket to a madman’s situation”.
I want to make a few points that I think are important here. First, on new clause 20, yes, we are very keen to get to a place where there are technical solutions. As I said when I introduced the consultation in December, there are some very clever people out there who know how to do some extraordinary things with IT, and there must be some very clever people who know how to ascertain the copyright of the material that they are using.
If we could get to a technical solution to rights reservations, so that AI companies have the security of knowing what material they are using—that they are scraping or ingesting—and creative industries know that their work is being used and can either say, “No, you can’t use it,” or, “If you are going to use it, you are going to remunerate me,” whether that is individually or as part of a collecting society, that would be a significant advance for us. That is one of the key things that the Secretary of State has repeatedly said we are trying to do. I am not sure that the precise way in which new clause 20 suggests we should bring forward a technological standard is the right way of doing it, but I am very happy to have further conversations with the hon. Member for Runnymede and Weybridge before we reach Report.
One thing that I think has been significantly difficult for everybody is trying to make a proper economic impact assessment of what each version of these outcomes would look like. How do you assess the value of future growth in AI in the UK against future growth in creative industries in the UK? It is very difficult to judge, and that is one of the things that I think that we need to address.
It is also true that many creative industries use AI all the time, either as part of their business processes or as part of the actual creative process, whether by editing a song so that it can use the voices of people who are no longer with us, through CGI, by dubbing films, or through a whole series of different things. Indeed, I was really struck when I went to Stratford last week to see a magnificent production of "Edward II" at the Royal Shakespeare Theatre. They are talking there about creating a new video game based on "Macbeth", called "Lili", and, of course, that will use a lot of artificial intelligence. A lot of tech companies are part of the creative industries, so I honestly do not think that these are two worlds that are completely apart.
I am tempted, because we both clearly share a love for the Sugababes, to draw a further reference to them. Of course, the creative industries also have quite a lot of experience of using other creative materials licensed from other producers. For the Sugababes song "Freak Like Me", I am fairly sure that the sample is directly taken from Gary Numan's "Are 'Friends' Electric?", and presumably they came to some sort of arrangement in organising that. This sharing of data is not unusual for the creative sector.
I have often wondered whether Kylie's song "Padam Padam" is a reference to Edith Piaf's "Padam padam…", because they are also quite similar. This is, of course, a well-traversed space for people working in the copyright sphere: trying to make sure that people are not passing off or using other people's work without permission. The economic and moral rights of rights holders are well established in UK law. We started the process of legislating for that in 1709, under Queen Anne. Dickens had an ambition to make sure that people could not simply take copies of his books to America and sell them there without any payment to him, which is part of why we ended up with an international copyright regime. We do not want to undermine that.
Transparency is key, and it was one of the elements that we put into our consultation. We need to be careful about precisely how that transparency works out. The hon. Gentleman referred to new clause 18, on which he wants us to act within three months. As ambitious as he may be for me, that is a remarkably short period of time for parliamentary drafting. After all, a version of the Bill was first introduced four years ago, yet it still does not have Royal Assent. He refers in the new clause to the terms "proportionate and effective". To go back to "Hamlet", aye, there's the rub. What do proportionate and effective mean, and how do you balance the two? Different sectors will have difficult and competing sets of agendas, and we may, in some shape or other, have to arbitrate.
The hon. Gentleman is right to say that no one, anywhere in the world, has come to a proper settled position on this issue. In the United States of America, for instance, a lot of AI companies have been relying on fair use, under the slightly different copyright system there, but the most recent federal case found in favour of Reuters against an AI company—the court said that scraping material was not fair use. The EU has not yet fully developed its proposals and has not decided how to implement its transparency requirements. It could be argued that that is one reason why there has not been much additional licensing, which was one of the EU’s declared goals.
There is also the question of how to enforce transparency. The clause—in a provision tabled by the admirable Baroness Kidron—specifically gives that enforcement power to the Information Commissioner. I am not sure that is the right place to put that power, or that the Information Commissioner has the expertise or resources for it. Undoubtedly, if there are to be new transparency requirements, we must have some form of enforcement.
All those things suggest to me that it is not right to put this power as an addendum to this Bill. The matter should be seen in the round, after a consultation, and there should be a proper primary legislation process, which may not happen for another 12 to 18 months, or even two years. I do not know.
The Secretaries of State for Culture, Media and Sport and for Science, Innovation and Technology and I are keen to get to a place where there is more licensing of copyrighted material by AI companies. I do not think anybody expects that the labour of others should be handed over to third parties without recompense or control. That, in itself, is not simple. It might be simple for the Design & Artists Copyright Society to do all the licensing for artists and photographers, but how could that be done for people who work as individuals, rather than as part of a collecting society? That is another set of issues that need to be addressed.
I know there are places in the world where one can buy an AI-generated book, and it might be perfectly readable. We might all have authors who we think have worked in that way in the past. But when I read a book, watch a film or listen to a piece of music, I want to know that a human was involved in it. Human creativity is a vital part of what renders that process so important. I think of humans as fundamentally social beings. One of the worst things that can be done to a human being is to put them in solitary confinement, because that denies our fundamental social being. What is so special about all the creative industries is that they enable a connection from human being to human being. I am passionate about not losing that, or the value inherent in it.
Amendment 22 agreed to.
Clause 95, as amended, ordered to stand part of the Bill.
Clauses 96 to 103 ordered to stand part of the Bill.
Schedule 10 agreed to.
Clauses 104 to 108 ordered to stand part of the Bill.
Schedule 11 agreed to.
Clause 109 to 112 ordered to stand part of the Bill.
Schedule 12 agreed to.
Clauses 113 to 115 ordered to stand part of the Bill.
Schedule 13 agreed to.
Clause 116 ordered to stand part of the Bill.
Ordered, That further consideration be now adjourned. —(Kate Dearden.)
Adjourned till this day at Two o’clock.