My Lords, I beg to move that the House do now again resolve itself into Committee on this Bill.
Moved accordingly, and, on Question, Motion agreed to.
House in Committee accordingly.
[The CHAIRMAN OF COMMITTEES in the Chair.]
Clause 61 [Disclosure of information to prevent fraud]:
[Amendment No. 104B not moved.]
105: Clause 61, page 33, line 18, after “order” insert “to be approved by affirmative resolution of each House of Parliament”
The noble Baroness said: I shall speak also to Amendments Nos. 106 to 108. These amendments would apply the affirmative procedure to orders contained in statutory instruments made under Clause 61, as they affect the specification of an anti-fraud organisation, and to the Secretary of State’s powers under Clause 62, as recommended by the report of the Delegated Powers and Regulatory Reform Committee.
It may be more convenient for the Committee if I deal with the coach before the horse and outline why we have tabled Amendment No. 106, since the Government appear to have conceded—the Minister is nodding—the point made by the DPRRC. The Government have tabled Amendments Nos. 129A and 132A, which fall later in the Bill. I know that the Government prefer to deal with such matters in that way. As the Committee will appreciate, the Opposition like to table amendments as they relate to the point of the Bill being debated, in order that they are debated in the round alongside other issues. Part 3 is complicated enough without trying to dump these issues at the end of the Bill.
Paragraph 11 of the committee’s report sets out its concern, which we share. It states:
“Clause 62 creates a criminal offence relating to onward disclosure of that information. But the offence applies only to the disclosure of ‘protected information’. This is defined in clause 62(5) as meaning certain revenue and customs information specified in the bill or ‘any specified information disclosed by a specified public authority’, both specifications being by order made by the Secretary of State subject to negative procedure”.
The report goes on to say that,
“The definition of protected information is central to the working of the scheme”,
with which we agree. Whether or not information is protected by Clause 62 from onward disclosure may impact on the extent to which public authorities will share information under Clause 61. Broadening the definition will also extend the scope of the criminal offence. The committee therefore considers that orders under Clause 62 should be subject to the affirmative procedure. That is why we tabled our amendments, and the Minister will be able to comment on the Government’s amendments shortly when she replies to this group.
I now turn to Amendment No. 105 relating to Clause 61 and the alternative drafting tabled to Clause 64. The aim is to ensure that anti-fraud organisations specified by order are subject to the affirmative procedure. It is important to ensure that the anti-fraud organisations to be specified are kept within a narrow scope of operation that can be justified to this House and to another place.
The Minister has said that no decisions have yet been made on which organisation or organisations should be so specified, but that CIFAS, the United Kingdom’s Fraud Prevention Service, which is a non-profit-making organisation, provides a good example of the sort of body that the Government have in mind. It is unhelpful that the Government are not prepared to specify which anti-fraud organisations will fall within the schedule, so that we could scrutinise the list and challenge the inclusion or exclusion of organisations. It makes it very difficult for this House properly to scrutinise these provisions.
Can the Minister give the Committee further examples of which other bodies the Government may have in mind as potential anti-fraud organisations to be so specified? Do they include the potential public-sector members which the Minister mentioned as having taken part in the pilot in the past? Perhaps the Minister will set out which might be involved and why. What criteria will an organisation be expected to meet in order that it may become so specified as an anti-fraud organisation? What criteria will it have to meet to establish a CIFAS-type arrangement? For example, will the Government allow designation of only those organisations which have a certain quality of data? Will they have to be of a specific size or have a particular remit? How will those organisations be judged as being appropriate to come within the specified status?
One assumes that data sharing between CIFAS and CIFAS-like anti-fraud organisations in this legislation will be subject to review by the Information Commissioner. Will the Minister indicate whether that is the case? If so, what assessment have the Government made of the workload that will be required of the Information Commissioner? Members of the Committee may be aware that there has been a trend for the Information Commissioner’s work to be substantially extended as a result of the work of this Government. I realise that one or two of the amendments from this side of the House may so extend the commissioner’s work, but we would say that ours will be done in a perfectly proper manner.
Finally, would the Minister consider whether there should be an appropriate amendment to Section 70 of the Data Protection Act 1998? After the definition of “school”, why should one not insert the definition of “specified anti-fraud organisation” as having the same meaning as in Part 3, Chapter 1 of the Serious Crime Act 2007? That might be a helpful proposal to make the legislation clearer and help to cut down on some of the provisions in the Bill. I beg to move.
I do not want to repeat too much of what has already been said very compellingly by the noble Baroness, Lady Anelay. The definition of protected information is of course central to the working of the scheme. Whether or not information is protected by Clause 62 from disclosure is very likely to impact on the extent to which public authorities will share any information under Clause 61. Broadening that definition will also extend the scope of the criminal offence. So we also consider that orders under Clause 64 should be subject to affirmative procedure.
I, too, cannot understand why the Government have omitted the information on which anti-fraud organisations will be specified. I look forward to hearing further information from the Minister. I appreciate that we can return to this matter on Report. Nevertheless, before we finish debating this matter, I hope we shall know exactly which organisations will be specified, even if that is on Report. The noble Baroness is nodding, which I hope is encouraging.
I can see the great interest in this issue. I am very conscious that we will look at the framework under the clause-stand-part debate. It may be helpful at this stage for me to set the framework out so that we can see how the amendments fit. The noble Baroness, Lady Anelay, has foreshadowed what I am going to say in relation to Amendment No. 129A, but I will come to that in its place. We are all happily on the same side on that, but I come now to outline where we are at present.
To recap on the purposes of Clauses 61 to 65: they fall into two distinct camps; that is, first, Clauses 61 to 64, which deal with data sharing, and, secondly, Clause 65, which deals with data matching. Clauses 61 to 64 provide a legal gateway for public-sector bodies to share information with each other and with the private sector in order to prevent fraud. This sharing will be done through a specified anti-fraud organisation.
Clause 61 provides a new legal gateway for those public authorities which need it to allow them to disclose information to a specified anti-fraud organisation for the purpose of preventing fraud. Clauses 62 and 63 make it an offence, and provide a penalty, for the onward disclosure of HMRC data in certain prescribed circumstances. Clause 64 amends the Data Protection Act to allow sensitive personal data to be processed for the purpose of the prevention and detection of fraud. However, this does not lift any of the other data protection requirements, such as lawful or fair processing. We will return to this when we look in more detail at the clause.
Clause 65 and Schedule 6 amend the Audit Commission Act 1998 to grant the Audit Commission-run exercise, the National Fraud Initiative, statutory powers under which to carry out data-matching exercises. At present, auditors match data only from bodies which are subject to audit, principally local government and health service bodies. New Section 32C would allow other bodies to provide data to the Audit Commission for matching as well, and would lift the statutory bars which might otherwise prevent this happening. This would allow other bodies in both the public and the private sectors to participate in the National Fraud Initiative provided—this is an important proviso—that the Audit Commission thinks it appropriate.
At present, data matching is only undertaken to combat fraud, and new Section 32G allows the Secretary of State by order to add to this purpose. The section also allows the Secretary of State by order to add to the bodies which the Audit Commission will be able to insist provide data for data matching. It is accepted that the private sector should be included only on a voluntary basis. The new provisions will also allow disclosure across the United Kingdom on a cross-border basis with the relevant Auditor-General if they choose to undertake similar data-matching exercises.
I am grateful to the Minister for giving way. She has referred twice to the private sector joining in with information sharing on a voluntary basis. Can she tell the Committee whether there is a possibility that the personalised rather than anonymised information relating to anyone suspected of crime will be provided to the private sector?
The noble Lord will know that the Audit Commission can identify areas which can then be offered to bodies to scrutinise to see whether there is any irregularity. That is how the data-matching initiative works. It is not a means by which illicit activity is specifically identified. It simply identifies areas which would then deserve scrutiny by the appropriate body. So, building on the debate we had on the previous occasion this arose, it is important to get the right people to look at whether there is improper behaviour. However, that is not the function of the initial matching.
As I was saying, these new provisions will allow cross-border activity with the relevant Auditor-General where that is deemed appropriate.
The noble Baroness has foreshadowed that I am going to resist her amendments to the Bill. Amendment No. 106 would achieve the same effect as government Amendments Nos. 129A and 132A which we will come to in their place. I understand why the noble Baroness is moving her amendments now. It is easier to deal with them at this point, although logistically the amendments put forward by the Government fit the Bill a little better. I shall therefore speak first to Amendment No. 105. This amendment suggests changing the order-making power provided in Clause 61 from the negative to the affirmative resolution procedure. The specification of the anti-fraud organisation is inherent to the entire section. Without it, the data sharing we propose to enable for the purpose of detecting and preventing fraud could not take place.
I would like to reassure the noble Baroness that any body chosen will not be chosen lightly. We will have a process by which we can scrutinise the functions of the body and how it will operate. Indeed, one of the issues for us now is that we think we need greater flexibility than the rigid prescription currently provides, and we have therefore identified the sort of agency we have in mind in terms of CIFAS. The issue will be looked at with the greatest of care. Further, the requirement of this body and any other bodies sharing information through the designated anti-fraud organisation to comply with the Data Protection Act necessarily limits the scope of the sharing that can take place. Complying with the Data Protection Act automatically attracts the oversight of the Information Commissioner. This is the fundamental safeguard for the operation of the specified anti-fraud organisation, and it is hard to see what additional safeguard would be brought in by making its specification subject to affirmative resolution. The Select Committee on Delegated Powers and Regulatory Reform has also examined this clause and declared that it represents the appropriate level of parliamentary scrutiny, and we respectfully agree. In the circumstances, I hope that the noble Baroness will feel that she has probed sufficiently in this regard.
I turn now to Amendment No. 106, which suggests an amendment to the order-making power provided in Clause 62 allowing information other than that of HMRC to have the added protection of the offences provided for in this clause. The noble Baroness referred to the fact that the Delegated Powers and Regulatory Reform Committee, in its fifth report of the Session, felt much as she does about these order-making powers. I am grateful to her for refreshing our memory of the basis on which the committee came to its view. It felt that the definition of protected information is central to the working of the scheme and went on to say that whether or not the information is protected by Clause 62 from onward disclosure, it may impact on the extent to which public authorities will share that information. It also saw the broadening of the definition as extending the scope of the criminal offence. The committee therefore felt that such an order should be subject to the affirmative resolution procedure.
I know that the noble Baroness will be delighted to hear that we are persuaded by these arguments, and it was on that basis that we tabled government Amendments Nos. 129A and 132A. These will give effect to the wishes expressed both by the committee and the noble Baroness. Together with her noble friend Lord Henley, the noble Baroness proposes that the anti-fraud organisation referred to in Clause 64 should be designated by order and subject to the affirmative resolution of each House of Parliament. It might be helpful if I explain in a little more detail Schedule 3 to the Data Protection Act and the proposed amendment to it made in Clause 64. Schedule 3 sets out the:
“Conditions relevant for the purposes of the first principle: [the data protection principle] processing of sensitive personal data”.
It provides a list of circumstances, one of which must be fulfilled, if the data is to be regarded as being processed fairly and lawfully.
Clause 61 provides a gateway for the sharing of information through an anti-fraud organisation for the prevention of fraud. However, not all public authorities will use the power conferred by Clause 61. There will be some which do not require recourse to it. As a consequence, Clause 64 has deliberately been designed to be “at large” to ensure that any sharing of sensitive personal information for the purposes of preventing fraud is lawful under the Data Protection Act. If this clause were limited to only a specified anti-fraud organisation, it would not cover that information being shared outside the powers provided by Clause 61. Having listened carefully to the noble Baroness, both during this debate and in the past, I know she would wish to have any such sharing of personal data done in a way that would be consistent with the Data Protection Act and indeed be entitled to the proper scrutiny that Act creates.
The noble Baroness made an interesting suggestion about amending Section 70 of the Data Protection Act to include a definition of “specified anti-fraud organisations”. There would be little point in adding such a definition as the expression does not occur in the Act, but I understand the reasons why she makes that suggestion.
I hope I have been able to reassure the Committee that the process we have adopted has been fair. I also reassure the House that the Information Commissioner’s formal response to the consultation paper endorsed the approach taken to improving data sharing. We have since had discussions with the commissioner’s office and will continue to involve it in the development of these policies. Indeed, my honourable friend Vernon Coaker in another place met the Information Commissioner on 25 January to discuss the Bill in detail, and the commissioner will be consulted on any codes of practice produced to guide these proposals. I hope that reassures the noble Baroness. We do not believe we are overburdening him in this regard, and neither do we criticise her for highlighting those areas where his support might be helpful.
I am grateful to the Minister for the care she has taken in setting out the framework of this part of the Bill and for setting it against the background of Clause 64 stand part. While she was speaking, I was reconsidering whether I would want to object to Clause 64 standing part. I will not do so at this stage. That does not mean I am happy with the clause, but the Minister has properly addressed the questions I was going to ask.
Throughout all of this, I and other noble Lords on this side of the Committee have been extremely concerned that the Government are in danger of enabling the further exchange of personal information where that is not appropriate. The Minister is trying to reassure us that this cannot be the place for that to happen and that many mechanisms will be put in place. She refers to a criminal offence being created. I was intrigued, by the way, when I was sitting waiting for this Bill to come on—we always seem to start late in the evening—and listening to her noble friend Lord Davies of Oldham wind up on the Government’s Statistics and Registration Service Bill. He also referred to the creation of a criminal offence that would, one hopes, tackle those who wrongly reveal information. The trouble with that is that it is closing the stable door after the horse has bolted. With our amendments we are trying to persuade the Government to ensure that personal information does not go beyond appropriate exchange.
The Minister says that one has to look at the role of the Audit Commission and the strict control it has. In talking to the Audit Commission before Second Reading, I was impressed with its approach. It is one of the most reputable organisations in this country or elsewhere. The commission’s argument is, as the Minister proposes, that it identifies what may be information that is out of the ordinary. A data match might throw up some activity that might lead to concern, which is then for someone else to deal with. The commission does not say, “This is the conclusion we draw from that information”; instead, it says to the public authority, “Have a look at this”, and leaves it to that authority. It says there is no leakage of information—but we have suspicions about the way information is transmitted between organisations, and that might not be the case. In addition, because of the way Clause 64 operates in conjunction with the Data Protection Act, there is a concern that some personal information might be disclosed that we would say should not.
The Minister tried to give a general answer to some very specific questions, but in an appropriate way—I do not think it was out of place in this debate. On occasion when she was being general, I did not find it satisfactory. For example, when dealing with my Amendment No. 105, she said that an organisation that is to be a specified anti-fraud organisation will not be “chosen lightly”. Well, I hope not. Am I supposed to say “Wonderful!”? If the Government chose such an organisation lightly, where would we be? She said the Government will have “a process” for deciding it. Good gracious me, which Government in this country should not? Finally, she says—again we come back to it—that the Government want greater flexibility than will be provided by having a list or examples in the Bill. Again the Government want flexibility while Parliament tries to exert scrutiny.
The only crumb of comfort we have been given through all this is the Government’s proposition that it will be a CIFAS-like organisation—but that does not tell us what might happen in the future. So I am still not happy with the amount of proper accountability we have achieved—in fact, I do not think we have achieved any. I will need to look at this again before Report. Some of these amendments will not return, such as, obviously, those that are shadowed by the Government’s. I shall not re-table an amendment to leave out Clause 64; I will try to deal with this in a more targeted way. I beg leave to withdraw the amendment.
Amendment, by leave, withdrawn.
Clause 61 agreed to.
Clause 62 [Offence for further disclosures of information]:
[Amendment No. 106 not moved.]
Clause 62 agreed to.
Clause 63 agreed to.
Clause 64 [Data protection rules]:
[Amendments Nos. 106A to 108 not moved.]
Clause 64 agreed to.
109: After Clause 64, insert the following new Clause—
“Register of information disclosure
(1) A specified anti-fraud organisation within the terms of section 61(8), and the Audit Commission in respect of matters referred to in section 65, must notify the Information Commissioner promptly on each occasion when information is disclosed to it by a public authority or other person, unless otherwise directed by the Information Commissioner.
(2) A notification under subsection (1) must include—
(a) the name of the notifying authority;
(b) the name of the person disclosing the information;
(c) the nature and quantity of the information disclosed; and
(d) the reason for disclosure.
(3) A notifying authority must comply with any directions made by the Information Commissioner as to the content of a notification, or as to such further information regarding the information and the use to which it has been put as the Information Commissioner may require.
(4) The Information Commissioner may issue such directions for the purposes of subsections (1) and (3) as he deems reasonable.
(5) The Information Commissioner shall maintain a register of all notifications received, and may publish it, or parts of it, and may conduct such investigations and publish such reports as he deems reasonable.”
The noble Baroness said: I do many things, but I do not normally masquerade as my noble friend Lord Lucas. I was looking to see whether or not he had managed to be here. My noble friend regrets that due to a longstanding engagement—this is what his note says—he is unable to be present today. He has asked me to move Amendment No. 109 and speak to Amendment No. 111 on his behalf. I shall amalgamate my own comments with those of my noble friend so as not to detain the House too long, and so that my noble friend Lord Northesk can get to his amendment very shortly.
My noble friend states that the objective of the amendments is to ensure that a proper record is kept of the use of these powers so that we may make considered judgments on them in future. Amendment No. 109 would insert provisions into the Bill for the creation of a register of information disclosure. Amendment No. 111 would ensure that, prior to conducting each data-matching exercise under new Section 32A in Schedule 6, the Audit Commission must produce a report on that exercise.
I shall return to my own comments. We on these Benches believe that these amendments rightly raise the issue of oversight of the data-matching regime. I hope the Minister will be able to explain exactly what regime there is and how often it will operate. We all recognise that the Information Commissioner has to deal with ever greater demands upon his time, so it is essential that when we make such demands they are justified. We believe it is right that the data-matching exercise should be seen to be both transparent and accountable. Oversight brings advantages, of course; it would help to prevent and monitor the potential function creep that commentators outside this House are concerned may occur.
Amendment No. 111 would also focus the Audit Commission’s minds on whether the data matching it wants to process is justified and proportionate in the way that the Government insist is intended. We have to be wary, however, of any overburdensome, overbureaucratic or costly checking measures. Any monitoring that is carried out has to be done in an efficient and effective manner.
I would be grateful if, in responding to my noble friend’s amendment, the noble Baroness indicated what the triggers would be to justify the decision to embark on the data-matching exercise. Will these be decided by the Audit Commission or by the anti-fraud organisations? How will they be communicated to the public or even the data subjects? Will the commission announce on its website that it is looking into new areas? These are some of the questions that my noble friend wished to be put to the Minister.
It will be important for accountability to show that the data-matching is sustained by proportionate use of powers. We have said throughout that public trust is vital if we are to expand the use of personal data. I agree with my noble friend that it is important to address what could be the unintended consequences of Part 3—they have not yet been addressed by the Government. It is in the spirit of trying to encourage the Government to look more carefully at what could be the impact of the provisions that the noble Lord has tabled the amendment. I beg to move.
We regret that the noble Lord, Lord Lucas, was not in a position to move the amendment himself, but the noble Baroness has done so with such elegance that I am sure he will be very happy that she was here to act in his stead.
Under the amendment, the anti-fraud organisation and the Audit Commission would have to notify the Information Commissioner each time they received information from a public authority, unless the Information Commissioner directed otherwise. Such notifications must include the name of the notifying body, the name of the person disclosing the information, the nature and quantity of the information and the reason for disclosure. Although I know that this is a probing amendment, it would also place further duties on notifying bodies to comply with any direction that the Information Commissioner made, and on the Information Commissioner to maintain a register of all notifications received.
The data that the anti-fraud organisations will share and the Audit Commission will use for its national fraud initiative will, as I said, be “personal data” for the purposes of the Data Protection Act 1998. Part 3 of that Act already prescribes a comprehensive registration regime for those bodies that process personal data and specifically prohibits them from undertaking this activity unless they comply with those obligations. It is perhaps important that we remember that “processing” in a Data Protection Act sense encompasses both receiving data and providing it, so that it will capture anti-fraud organisations, the Audit Commission and any other body that provides them with information—in other words, the bodies that are contemplated by this amendment. The data controllers who are processing personal data must notify the Information Commissioner of their particulars including, among other things, their names and addresses, a description of the personal data to be processed, the purposes of the processing and a description of the recipients to whom such data may be disclosed. I understand what the noble Baroness is seeking in her amendment, but I respectfully suggest that it is difficult to see how it would add anything of value to the registration system that is already in place.
Further, the Information Commissioner already has a comprehensive suite of powers in relation to the bodies that will be sharing or matching data and the work that they do. He can assess whether they are complying with any of the duties under the Data Protection Act. He can issue notices requiring these bodies to furnish him with such information as he may specify for his purpose. He also has enforcement powers to rectify instances of non-compliance and is able to prosecute individuals for offences under the Act.
The very nature of fraud means that the bodies that are trying to tackle it need to be able to move quickly. This process could be hampered by an amendment such as this. The noble Baroness herself made it clear that she did not want to overlay unnecessary bureaucracy or anything of that sort. There would also be significant resource implications for the Information Commissioner in so far as how he would be expected to manage and process an influx of notifications.
The noble Lord, Lord Lucas, suggests in his Amendment No. 111 that the Audit Commission should be placed under a duty to produce a detailed and prescriptive report before it undertakes each data-matching exercise. This amendment is not only unnecessary but bureaucratic and unworkable. Under it, the Audit Commission’s report would first have to state the reasons for conducting the data-matching exercise. That is unnecessary as the reasons for data matching are already set out in Section 32A. At that stage, data matching would be carried out to assist in the prevention and detection of crime. The amendment would then require the report to detail any assumptions to be made in the data-matching exercise, the audit to which those assumptions would be subject, and the outcome of the data matching which the Audit Commission would consider as successful. It is unclear what the noble Lord means by “assumptions”, but in any event the Audit Commission makes no assumptions about whether any person whose data is shown to match may or may not be guilty of fraud. It will simply match data provided to it and forward any anomalous matches thrown up to the relevant bodies and their auditors for further investigation.
The amendment proposes that the Audit Commission’s report would also have to detail why it considered that the data-matching exercise would be a proportionate use of its powers and the steps that it would take to ensure that data subjects are protected. Those, too, are unnecessary requirements. First, the Audit Commission is obliged to comply with the Data Protection Act and the European Convention on Human Rights regardless of the circumstances; secondly, the code of practice, which the Audit Commission is required to promulgate, is the more appropriate place for addressing issues of this nature.
It may reassure the Committee to know that the Audit Commission undertakes pilot exercises to ensure that any wholly new data matches that are incorporated into the national fraud initiative will be likely to yield a high incidence of anomalies, or anomalies which, if found to be fraudulent, could involve large sums of money. It also has detailed security standards which apply to its data-matching exercises and to participating bodies. All these issues are already provided for in the Audit Commission’s existing code of data-matching practice. For those reasons, I must resist the amendments.
The noble Baroness asked what the triggers were to justify data sharing—for example, from CIFAS rather than the Audit Commission. Although we cannot give a definite answer without reference to a specified anti-fraud organisation, CIFAS insists that, before filing a report for other members to check against, members must have enough evidence of fraud to make a report to the police. We think that that practice has proved sound and successful.
The noble Baroness also asked how the commission chooses the data sets and whether those data sets would be published. I hope that I have covered that in the answer that I gave. There is clarity in how that is done. It has worked well, and we believe that it is a sound premise on which to go forward. I hope that, with that explanation, the noble Baroness will feel able to withdraw the amendment.
My noble friend Lord Lucas will of course read carefully the explanation given by the Minister. However, the Minister will be aware that it is not only my noble friend who was put at a disadvantage by not being able to attend today; other noble Lords were in the same position. We are very grateful that the Government Whips Office found extra time for the Bill tonight at somewhat short notice. Noble Lords will recall that we had two late starts and lost more than two hours of debate when we were engaged in another matter, on deciding the future of this House—or not, as the case may be. That means that noble Lords were unexpectedly told quite late that we would be dealing with the Bill today, and I know that it is disappointing for some who have taken an active part in the Bill so far not to be able to be here.
The Minister proposed two main arguments against my noble friend: first, that a registration system is already in place; and, secondly, that my noble friend’s solution to the problem he perceives is potentially bureaucratic and has unwelcome resource implications for the Information Commissioner. That is exactly why my own name was not put to the amendment—because I appreciated that point about the costs.
My noble friend anticipated that the Minister might take that view and has a riposte. Amendment No. 109 might seem to impose a huge requirement for record-keeping but it should not. Under subsection (4) of the amendment, the Information Commissioner may issue directions, and my noble friend would expect him to use that power to limit the obligations imposed to reasonable levels. The key is that that should be the Information Commissioner’s decision. However, my noble friend accepts with a succinct comment at the end of his notes that Amendment No. 111 is rather tougher to argue in that regard. I am sure that he will consider carefully before he decides whether to bring the matter back on Report. I beg leave to withdraw the amendment.
Amendment, by leave, withdrawn.
110: After Clause 64, insert the following new Clause—
“Functions of the Secretary of State as to sharing of information
(1) The Secretary of State has the following specific functions in respect of the sharing of information—
(a) to draw up and disseminate to the public bodies and other organisations to whom this section applies guidance as to the sharing of information between and amongst themselves;
(b) to draw up and disseminate to the public bodies and other organisations to whom this section applies guidance as to the circumstances in which it is appropriate for those organisations to share information between and amongst themselves;
(c) to maintain under review the guidance set out in paragraphs (a) and (b).
(2) In drawing up the guidance set out at subsection (1)(a) and (b), and in reviewing such guidance under subsection (1)(c), the Secretary of State shall consult with the Information Commissioner.
(3) The guidance under subsection (1)(a) and (b) shall in particular, but not exclusively, make provision—
(a) as to the nature of the information that may be shared;
(b) as to procedures designed to ensure the accuracy and security of information shared;
(c) as to procedures designed to ensure, where appropriate, the co-ordination of the sharing of information between and amongst the public bodies and other organisations;
(d) as to procedures designed to govern the circumstances in which information can be lawfully shared notwithstanding any rule of law which prohibits or restricts the disclosure of information;
(e) as to procedures designed for circumstances where, notwithstanding the second data protection principle, data is intended to be lawfully shared or processed beyond the purpose of its original collection;
(f) as to procedures designed to guarantee, as appropriate, the rights of data subjects in respect of any information about them that may be shared;
(g) as to procedures designed to govern the period for which it is appropriate that information should be shared and to ensure appropriate deletion of any information shared.
(4) This section applies to public authorities and any anti-fraud organisations specified under section 61 and any agencies, companies or individuals who may be contracted to work for them or to supply goods and services to them.
(5) The information for which provision is made under this section includes all information disclosed under section 61 above.
(6) The Secretary of State may by regulations subject to affirmative resolutions in each House of Parliament, proscribe and penalise contravention of any guidance under this section as to the collection, sharing, use, holding and disclosure of information.”
The noble Earl said: The amendment’s purpose is straightforward—that the Secretary of State draw up a statutory code of practice in respect of the Bill’s information disclosure provisions with the intention that it be enforceable in law. In so doing, it goes rather wider than my noble friend Lady Anelay’s proposition in Amendment No. 103, which we debated last week. As has already been observed, there is a sense of déjà vu here. Indeed, I suspect that the Minister finds my persistence with this somewhat tiresome. Nevertheless, I begin by agreeing with my noble friend Lord Lucas in his observation at Second Reading that there is,
“no good argument before the event for preventing these sorts of activities”.—[Official Report, 7/2/07; col. 746.]
By that he means information sharing. I acknowledge absolutely that such IT processes can be of considerable utility in combating fraud. What matters, therefore, is that any such regime must be properly accountable and transparent.
The Government argue that adequate accountability and transparency are afforded by the terms of the Data Protection and Human Rights Acts. However, the noble Lord, Lord Thomas of Gresford, clearly disagrees. At Second Reading he suggested:
“The Data Protection Act is given lip service in the Bill and is then circumvented … Confidentiality is overridden, the Data Protection Act is overridden, no general code is proposed to govern the arrangements and the circumstances in which the disclosure is to be made are not to be limited in any way”.—[Official Report, 7/2/07; col. 739.]
Clearly there are manifest difficulties of interpretation here, not only in terms of the extent to which the DPA may or may not apply to the provisions but also in terms of whether the powers being sought are proportionate. The Minister graciously hinted as much in her recognition at Second Reading that the Government must ensure that,
“the arrangements are transparent and command public confidence, are proportionate and are subject to periodic review”.—[Official Report, 7/2/07; col. 732.]
For my part I accept that, as the Minister suggested last week, our respective positions are not that far apart. I also acknowledge that, on strict interpretation, the Government could merely rely on codes of practice from the Information Commissioner, the Audit Commission and so on as appropriate safeguards. Indeed, in respect of the generality of data protection, I acceded, albeit with some reservations, to this flexibility when the House scrutinised the DPA nearly 10 years ago. This defines a major part of the problem. For all the technological neutrality of the DPA, IT has developed exponentially since its enactment. Its processes are so much more pervasive and powerful, particularly within government, than any of us anticipated as we wrestled with these issues in 1998. At least in part, Lord Williams of Mostyn, whose wise counsel in this and many other matters we all miss, recognised this in suggesting:
“If one has the possibility of data matching, one looks at that with a degree of anxiety”.—[Official Report, 25/2/1998; col. CWH129.]
Today that anxiety is given particularly sharp focus as a result of the Information Commissioner’s recognition last November that his fears that the UK would,
“sleep-walk into a surveillance society”,
have become a reality. This reality encompasses a situation where, according to a report commissioned by the Information Commissioner:
“Most profoundly, all of today’s surveillance processes and practices bespeak a world where we know we’re not really trusted”.
As he himself has stated, this begs the question where the line between increased surveillance and appropriate safeguards should be drawn. Moreover, the debate about this, and any decisions that may result, should properly belong to wider society, including Parliament, rather than residing exclusively in the hands of the Executive. Consequently, there is a wholly legitimate case for arguing that the provisions of this Bill should be subject to a greater level of accountability and transparency than that proposed by the Government.
Indeed, this becomes especially important in the context of the Minister’s observation at Second Reading that:
“The data-sharing provisions in the Bill are … very much about providing the mechanisms. They do not go to the nature of the data sharing itself. That is for later, at the implementation stage”.—[Official Report, 7/2/07; col. 732.]
In effect, the Home Office deems the practical operation of the Bill’s provisions to be somewhat outwith its drafting. By implication there is a question mark in the Government’s own mind about the Bill’s compliance with the DPA. Surely, it is more appropriate to ensure that the source legislation—this Bill—guarantees that the data-sharing regime envisaged is fully compliant.
The Government should recognise that there is an over-arching and more fundamental reason why the amendment, or something like it, is desirable. For manifestly obvious reasons the Government are the most extensive collector and holder of data about the individual. This fact imposes a heavy and inescapable responsibility on the state, especially given the way in which justifiable concern about the encroachment of the “database state” has grown substantially in recent years. There is increasing scepticism and distrust of the Government’s capacity to administer and manage our individual data proportionately and fairly. The recent question about the collection of children’s biometric data and the decision of the Home Affairs Select Committee of another place to investigate the “surveillance society” are illustrations of the point.
Bizarrely, therefore, just when there is an urgent requirement to strengthen data protection, the Government appear to be weakening it substantially. Indeed, if trust and confidence in our political process are to be reinvigorated, the state is under an obligation to ensure that data management regimes within the public sector are as robust as possible—in fact, even more robust than is the case generally.
Evidently, the proposed new clause is aimed at fulfilling this obligation. But more than this, it also seeks to offer the “general code” called for by the noble Lord, Lord Thomas of Gresford, as well as, in line with the wishes of the Minister,
“to ensure that the arrangements are transparent and command public confidence, are proportionate and are subject to periodic review”.—[Official Report, 7/2/07; col. 732.]
The report commissioned by the Information Commissioner, to which I referred, argues that—and this is a very significant sentence from that report:
“Social relationships depend on trust and permitting ourselves to undermine [data protection] seems like slow social suicide”.
I am certain that the Government are antipathetic to that prospect. I beg to move.
I strongly support my noble friend’s amendment. As he said, it follows on very properly from the debates that we had on Amendment No. 103 on the code of practice for sharing data.
The amendment would give practical clarity to the provisions governing the disclosure of information, against a background of reminding us of some of the concerns in this country about how information may or may not be disclosed. My noble friend was absolutely right to draw our attention to the fact that IT has developed exponentially since the passage of the 1998 Act. We have to look carefully at whether there is now a further need to strengthen data protection just at a time when the Government appear, through the Bill, to be trying to weaken it.
Like my noble friend, I was struck by the fact that we are having these debates against the background of the decision in the past few days of the Home Affairs Select Committee in another place to carry out a thorough investigation into what has been colloquially called the “surveillance society”. It will be looking specifically at data protection and data exchange as part of that investigation. Obviously, another place will benefit from that report when this Bill reaches it; we shall not, which is a disadvantage to noble Lords in considering the Bill.
My noble friend has argued his case with devastating logic. Of course we all agree, as we have said before, that there are legitimate arguments in favour of the utility of databases, and there are arguments that could strengthen the security of people in this country. Yet again, as my noble friend has made clear, it is essential that where there are links between different data sources or an increased use of data matching, we are sure that that is subject to very clear parliamentary debate and accountability.
I hope that the amendment will be accepted. If the Government say that the drafting is not perfect at this stage, I hope they will accept it in principle and work with my noble friend in trying to achieve a drafting that would be able to go on to the statute book. The amendment, and the principle behind it, will not easily go away.
I say to the noble Earl, Lord Northesk, that he is by no means tiresome in this regard. He and I both know that this is an important issue which we have to scrutinise. He is absolutely right when he says that, in terms of the end result that we wish to achieve, there is very little between us, if anything. I hope that I will be able to persuade him that the way in which the Bill has been structured in no way weakens the substantial protection provided by the Data Protection Act and by the Human Rights Act. Although I will resist the amendment, I hope that I will be able to persuade the noble Earl that we have unanimity of view.
The noble Earl asks, in the amendment, for the creation of a provision for the Secretary of State to produce and disseminate guidance to all those using the data-sharing powers under Clause 61. I absolutely understand the scope that he seeks to explore, because he asks that the guidance be disseminated to all those involved in data sharing and that it cover the type of sharing of information that can take place between and among those involved and the circumstances in which that sharing can take place. The amendment provides that the guidance should be maintained under review and that the Information Commissioner should be consulted on the content of the guidance.
The amendment also requires that the guidance should cover the procedures designed to ensure accuracy and security of the information being shared under the powers, the procedures to ensure co-ordination between bodies and agencies sharing information, the procedures that govern the circumstances in which information can be shared and the procedures designed for circumstances where, notwithstanding the second data protection principle, data are intended to be lawfully shared or processed beyond the purpose of their original collection.
The final requirements of that part of the amendment cover the procedures that would guarantee the rights of the data subject and the procedures governing the period of retention of data and how such data will be disposed of. I reassure the noble Earl that we have looked very carefully at those issues. We understand the way in which he puts them, but we question the need for that degree of prescription. The Data Protection Act provides the regulatory framework for data sharing and I would be reluctant to introduce a further layer of regulation that would seem, I respectfully suggest to him, to add little to the general regime. Nothing in these clauses authorises disclosures that contravene the Data Protection Act.
At a practical level, I question whether it would be possible to draw up guidance that could sensibly be applied to all the circumstances and types of information that public authorities may wish to share in order to prevent fraud. There would be the difficulty of the guidance applying only to those public authorities that use the power provided by Clause 61—many, as we discussed earlier, would not do so.
It may be helpful, since we have not specified the organisation yet, if I use CIFAS as an example of how the system might work. CIFAS is the UK’s fraud prevention service. This is an organisation of the sort that the Government have in mind for the purpose. CIFAS currently has more than 250 members from the financial services sector. Members of CIFAS put on to a central secure database information on those who have attempted to defraud them. Members can then check new applicants against this central database and, should a match occur against this applicant, further investigation will be carried out before a service is granted. Automatic rejection on the basis of a match is not permitted by the rules of membership. CIFAS is a data controller for the purposes of the Data Protection Act and has its own rules, which conform with the Act. We argue that that is a much better way of addressing the issues that the noble Earl’s amendment seeks to address.
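The membership rules the Minister describes—a report filed to a shared register, new applicants checked against it, and a match treated as a trigger for further investigation rather than automatic rejection—can be set out as a minimal sketch. The data structures, names and evidence check below are illustrative assumptions, not a representation of CIFAS's actual systems:

```python
# Sketch of a shared fraud-prevention register of the kind described:
# a match prompts investigation, never automatic rejection.
# All names and structures here are hypothetical.

fraud_register = set()  # central secure database of reported identifiers

def file_report(identifier, evidence_sufficient_for_police):
    """Members may file only with enough evidence to report to the police."""
    if not evidence_sufficient_for_police:
        raise ValueError("insufficient evidence: report not accepted")
    fraud_register.add(identifier)

def check_applicant(identifier):
    """A match returns 'investigate'; membership rules forbid outright rejection."""
    return "investigate" if identifier in fraud_register else "proceed"

file_report("applicant-123", evidence_sufficient_for_police=True)
print(check_applicant("applicant-123"))  # investigate
print(check_applicant("applicant-456"))  # proceed
```

The point of the sketch is the safeguard in `check_applicant`: the register can only ever route an application to a human investigation, which mirrors the rule of membership the Minister cites.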
Central to the reason for resisting this clause is that none of the data sharing enabled by Clause 61 can be done without complying with the Data Protection Act. On this basis, the amendment is, I respectfully and gently suggest, unnecessary.
We have given a clear exposition of the reasons why we think that data sharing is good; indeed, the noble Earl concurred with that view. We need safeguards; I concur with him. Those safeguards are properly set out in the Data Protection Act. I also say to the noble Earl that if it was part of the effect of this Bill that we were in any way undermining the safeguards provided by the Data Protection Act, making it more difficult to apply or disapplying it, I would feel the level of discomfort that he obviously feels.
The whole premise on which the Bill is created is that the Data Protection Act remains, with its full bite, and that the commissioner has the same duty and responsibility in relation to these acts—and omissions—as he has in relation to any other data. That is a very powerful tool to prevent us from sleepwalking into the surveillance society that the noble Earl presented us with. I, too, remember the sagacity of Lord Williams; he is greatly missed by this House. However, on this occasion, it is likely that his view would rest with mine.
I am grateful to the Minister for her reply; not surprisingly, I am even more grateful to my noble friend Lady Anelay for her contribution. I shall state something that is obvious and which I sought to draw out when I moved the amendment. Part of my complaint is not necessarily that the DPA does not apply properly in respect of these provisions; the fact is that the DPA is 10 years old. Ten years on, the whole IT world is a completely different kettle of fish. Part of my problem is whether the DPA of itself is robust enough to deal with this sort of provision in the current circumstances. I shall not labour the points further as it is late. I shall withdraw the amendment but, if only at the urging of my noble friend Lady Anelay, I have absolutely no doubt that we shall return to this matter on Report. For the moment, I beg leave to withdraw the amendment.
Amendment, by leave, withdrawn.
Clause 65 agreed to.
110A: After Clause 65, insert the following new Clause—
“Sharing of information and data matching: assessable processing
Section 22 of the Data Protection Act 1998 (c. 29) (preliminary assessment by Commissioner) has effect in respect of the sharing of information and data matching for which provision is made under this Part.”
The noble Baroness said: The amendment inserts a new clause after Clause 65 to enable the sharing of information and data matching under Part 3 of the Bill to be subject to assessable processing under Section 22 of the Data Protection Act 1998. My noble friend Lord Northesk, who is no longer in his place, is concerned that, 10 years on, the Data Protection Act may not be as robust as it should be, whereas the Minister has just said that the DPA remains with its full bite. As far as the Government are concerned, the DPA is robust and able to deal with matters.
Section 22 enables the Secretary of State to designate particular kinds of processing that appear to be likely to cause substantial damage or distress to the data subject or otherwise significantly prejudice the rights and freedoms of data subjects as assessable processing. Proposals for that have to be submitted to the Information Commissioner before the processing can go ahead. As I understand it, the commissioner cannot forbid the processing, but he can issue a formal opinion to the controller and, presumably, if the controller does not take notice, after the processing starts he can take action to restrain the disapproved-of processing. That is not good grammar but the Committee will understand what I mean.
Can the Minister confirm that no types of processing have ever been designated and, if so, why has none been so designated? Would it not be possible for the Secretary of State to designate both the disclosures to the private sector and the data-matching exercises as assessable processing, so that the commissioner could review the detailed plans before they go ahead? That would take advantage of an existing system and provision, which is surely exactly what Section 22 was intended to do.
Interestingly enough, although there has apparently been no designation, I am advised that the activities of inquiry agents have proved to be a major problem. The Information Commissioner’s Office has published a report, What Price Privacy? The Unlawful Trade in Confidential Personal Information, which details the problems. One major example included records of information supplied to 305 named journalists working for a range of newspapers. The Government have accepted a recommendation of that report and I understand are now committed to imposing a prison sentence for offences of obtaining data unlawfully under Section 55 of the Data Protection Act.
In response to the consultation paper, Increasing Penalties for Deliberate and Wilful Misuse of Personal Data, the Government stated that they are strongly committed to ensuring that there is robust protection for personal data. If that is the case, why did the Government not take up the option to designate activities of inquiry agents under Section 22, as they originally intended? While I agree that, as I have mentioned before, tougher sentencing can be a deterrent, it is simply closing the stable door after the horse has bolted. We need to be proactive, in the first place, in protecting our personal data. I beg to move.
The noble Baroness is right that under Section 22 of the Data Protection Act the Secretary of State can make an order designating a type of data processing as assessable processing, if it appears to him that it is likely—this is the important point—to cause substantial damage or distress or otherwise significantly prejudice the rights and freedoms of the data subjects. That is the benchmark.
An order under Section 22 of the Data Protection Act would add an extra stage to the notification regime already in place under that Act. Data controllers, such as the Audit Commission, are required to notify the Information Commissioner on how they intend to process personal data and they are required to be included on a register kept by the Information Commissioner. The noble Baroness has outlined in part how this system works. If an order were made under Section 22, the Information Commissioner would have to decide whether the intended processing would be assessable processing and, if so, whether or not it would comply with the Data Protection Act. The Information Commissioner would have a period of 28 days in which to make his assessment, which he could extend by a further 14 days, and the data controller would not be able to begin any assessable processing during that period. The process is tough.
The noble Baroness is right that the Secretary of State has not, to date, made an order under Section 22, so the Information Commissioner is not currently required to consider the question of assessable processing. If the Secretary of State considers that any data sharing or matching under the powers in the Bill are likely to cause substantial damage or significant prejudice, he already has the power to make an order under Section 22 requiring a prior assessment by the Information Commissioner. That could happen without the amendment. We already have what the noble Baroness seeks.
The Government do not consider the proposed data sharing particularly likely,
“to cause substantial damage or substantial distress to data subjects”,
or otherwise significantly prejudice their rights and freedoms. The processing is a necessary measure for the prevention of fraud. As the noble Baroness identifies, however, if it were to cross the threshold of Section 22, we would have the power to use that process. The possible delay of 28 days or longer while the preliminary assessment took place could seriously hinder the fraud prevention purposes of the new powers.
Finally, this provision could also distract the Information Commissioner’s Office from focusing on those activities likely to cause the greatest harm to individuals, since it would be required in all cases to consider whether a notification for a data controller included any assessable processing. I know that the noble Baroness would not seek that and, as I said, the powers created by this clause are, and will continue to be, subject to the Data Protection Act. The Information Commissioner’s Office will continue to investigate whether data sharing and matching comply with the Act in the same way as it regulates all other data processing. There is therefore no need to include an express reference to Section 22 of the Data Protection Act in the Bill.
The difficulty is that, although the Minister says that we do not need this because the power is there, she then says, “We do not want to use the power because it will cause a 28-day delay and therefore, perhaps, undermine what we are trying to achieve; and it would distract the Information Commissioner from doing other work that might be more appropriate”. The issue is that these powers are there and are appropriate in this context. It focused my mind on what Part 3 of the Bill is all about, against a background—as my noble friend Lord Northesk said—of our being 10 years on from 1998.
I appreciate that this is an odd amendment; it is not in the general run of what we have been debating. It exists against the background of our feeling that the Government have repeatedly introduced measures without proper forethought—or, at least, without sharing it with the Committee. Because of advice that we have been given from those I take seriously, who have been involved in advising organisations on data sharing, processing and assessable processing within the terms of the Data Protection Act, on this rare occasion—I do not intend to pursue the matter on Report—I shall seek the opinion of the Committee.
Schedule 6 [Data matching]:
moved Amendment No. 110B:
110B: Schedule 6, page 64, line 22, leave out “(including the identification of any patterns and trends)”
The noble Lord said: I shall also speak to Amendment No. 112. Clause 65 provides for Schedule 6. Paragraph 2 of the schedule inserts a new Part 2 into the Audit Commission Act 1998.
New Section 32A provides for the Audit Commission to carry out data-matching exercises or to arrange for another organisation to do that on its behalf. New subsection (2) defines a data-matching exercise, while subsection (4) provides that such,
“assistance may, but need not, form part of an audit”.
Amendment No. 110B amends this definition to restrict it to,
“the comparison of sets of data to determine how far they match”,
excluding the identification of any patterns and trends.
New Section 32C in Schedule 6 provides that, where the Audit Commission thinks it appropriate, it may,
“conduct a data matching exercise using data held by or on behalf of bodies not subject to section 32B”,
which sets out the bodies that may be required to provide information to the commission in order to conduct a data-matching exercise including sensitive personal data.
The Explanatory Notes highlight that those voluntary bodies could include central government departments and some private sector bodies, such as mortgage providers. Can the Minister confirm whether that provision could provide access to the children's index or the national identity register? Is it correct that information disclosed to the commission for matching could then be disclosed to an unrestricted range of bodies for fraud detection and prevention purposes, or is there another statutory duty to disclose the information?
Amendment No. 112, our second amendment, would remove Section 32C in its entirety.
The amendments are intended to explore what the Government really mean by data-matching in this context and to probe the extent of data-sharing that they expect to take place, with all the questions that spiral from that. I hope that the Minister will take the opportunity to explain in some detail exactly what the Audit Commission does with the personal data with which it deals as part of the national fraud initiative. Is it all for audit purposes? We need to question what the Government might want the Audit Commission to do with access to a greater mass of personal data under the provisions. Should we consider why the Audit Commission should be empowered to match data that do not form part of an audit?
On one level, data-matching could involve little more than the comparison of two or more sets of data to see whether there are overlaps. For example, the commission compares data to identify whether someone is claiming two benefits that are supposed to be mutually exclusive—in the old days, whether they were claiming income support on one hand and unemployment benefit on the other. Payrolls are another example. Data on individuals can be matched to see whether someone is claiming benefits in one borough but working in another.
However, Liberty highlights that, in reality, the definition of data-matching goes much further. The Bill states that it is to include the identification of any patterns and trends. There are serious concerns that that is more akin to data-mining than data-matching.
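The distinction being drawn can be sketched in a few lines: matching compares sets of records directly for overlaps, while mining—the "patterns and trends" wording—scores records against a pattern without any prior suspicion attaching to the individuals concerned. The records, fields and threshold below are invented solely for illustration:

```python
# Hypothetical records: benefit claims in one borough, a payroll in another.
claims  = [{"nino": "AB123456C", "borough": "X"},
           {"nino": "CD789012E", "borough": "X"}]
payroll = [{"nino": "AB123456C", "borough": "Y"}]

# Data MATCHING: a direct comparison of the two sets for overlaps.
matched = {c["nino"] for c in claims} & {p["nino"] for p in payroll}
# matched is an anomaly to be investigated, not proof of fraud.

# Data MINING: flagging anyone whose record merely resembles a pattern.
# The pattern and threshold here are invented examples.
def pattern_score(record):
    return 1.0 if record["borough"] == "X" else 0.0

flagged = [c["nino"] for c in claims if pattern_score(c) > 0.5]
# flagged sweeps in both claimants, including one with no match at all —
# the essence of the "fishing expedition" concern.
```

The sketch makes the objection concrete: the matching step starts from a specific cross-check, whereas the pattern step flags individuals against whom no comparison has raised anything.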
As my noble friend Lady Anelay highlighted at Second Reading,
“the Bill could open the way for operations under which software was used to search several databases to identify suspicious patterns of activity that simply could not be spotted when the data were seen individually”.—[Official Report, 7/2/07; col. 736.]
In essence, the Bill enables what are commonly termed fishing expeditions—data-mining that does not have to be founded on any suspicion or intelligence that a person or company has done anything wrong.
The Government’s consultation acknowledged that there would be concerns about the legality of data-mining. They were right and have so far failed to convince commentators on the Bill that it is a proportionate measure. Liberty believes that data-mining, by its very nature, will not be as targeted, nor the intelligence as well sifted, as the Minister suggested at Second Reading. In fact, it has significant concerns about the Bill’s compliance with both Human Rights Act and Data Protection Act principles. As we have discussed, huge quantities of data could be analysed, and while that may help to identify a few criminals, is that enough justification to subject the majority of the innocent population to such measures?
As well as those points of principle, there are practical considerations. As the noble Lord, Lord Thomas of Gresford, said on Second Reading, there is no guarantee that any patterns or trends thrown up by data-matching are meaningful or significant. There is a considerable amount of luck—one could say chance—involved. Who will interpret the results of the data-mining: the Audit Commission or the organisation to which it releases the data? There are not many steps from a trawl of data dictating who will be investigated because of their characteristics or behaviours to the justification that the Government need under Part 1 for a serious crime prevention order.
I understand that, in the presentation to our researchers last Monday, the Audit Commission explained that it merely matches the data; it is then up to the local authority, for example, to follow up that match to establish whether there has been a simple mistake or fraudulent use. How do, and how will, the Government ensure that there is adequate training for and checks on those who are provided with the data in interpreting and handling the information? What hoops do organisations have to jump through to check the status of the information? Will they stop an individual’s benefits and then check his status, or will they check and then stop his benefits? This, once again, goes back to the need for a code of practice across the board.
Will there be a process of complaint for individuals if the Audit Commission or the bodies to which it releases data get it wrong? I understand that twins, for example, especially those who have the same initials, can prove particularly tricky in these circumstances.
We need only look at the inaccurate results thrown up by any data-mining exercise conducted on our shopping practices by the likes of private sector bodies such as Tesco, based on loyalty cards, to see that data-mining is not infallible. When it leads to people being sent vouchers for a brand that they would never buy, it is merely an annoyance; if, however, it led to an innocent person being subjected to a police investigation or preventive measures, the personal cost would be much greater and would be unacceptable.
Even if it were acknowledged that the investigation was a mistake, would not the record that there had been an investigation be kept on file? Would that record link into the national identity register, for example, so that those using it to verify personal details would see that someone had been investigated, regardless of the fact that it was an error or that the person was found innocent? Again, at this point, the adage that mud sticks or that there is no smoke without fire would hold. It would certainly indicate the reaction that many people might have to such information.
We have in the past drawn a comparison with the more stringent regime of Germany. Data there may be mined only with the authorisation of the court—something that is missing here—and for the following purposes. First, there must be evidence that a crime may have been committed. Secondly, the crime in question must be serious and one of the specific criminal offences set out in the criminal procedure rules, such as the trafficking of drugs or weapons, endangering the safety of the public or creating risk to life or limb. Thirdly, the investigation of the crime would be seriously impaired if the public authorities were denied the right to carry out the data-mining exercise. What assessment have Her Majesty’s Government made of the regime in Germany, and what consideration did they give to applying similar stringent restrictions to data-mining in this country?
New Section 32C(7) in Schedule 6 enables a data-matching exercise to include,
“data provided by a body or person outside England and Wales”.
Whom do the Government have in mind in that provision? Would data from anyone anywhere in the world be included? Could the body or person in theory include offshore bank accounts or even organisations such as the CIA? Will an organisation have to meet any criteria before it can take part in the voluntary provision of data under subsection (7)? Will an organisation have to adhere to rules on what it does with the data once they are matched?
There are concerns that parliamentary approval for data-mining in the context of protection against fraud will be open to function creep, an area where, dare I say it, the Government’s record does not inspire trust. There are concerns that such approval will be treated as a green light for the use of data-mining processes in many other contexts. I hope that the Minister will be able to address these and other concerns, which will no doubt be expressed by Members on other sides of the House. I look forward to hearing her reply. I beg to move.
We on these Benches very much agree with this group of amendments, as is obvious from the fact that the name of my noble friend Lord Dholakia joins mine on Amendment No. 110B. Liberty suggested Amendment No. 110B to us, as it may have done to the noble Lord, Lord Henley. As Liberty points out, the amendment would restrict the definition of data matching to the comparison of sets of data to determine how far they match. It would no longer be defined as including,
“the identification of any patterns and trends”.
One of Liberty’s greatest concerns about the privacy implications of the Bill relates to the data-matching provisions in Schedule 6. These would give the Audit Commission the power to conduct data-matching exercises, or to make contracts with other bodies, public or private, on its behalf. It would require bodies subject to audit by the commission to provide information for the purposes of these exercises. Schedule 6 would also empower bodies whose accounts the commission does not audit to provide information for the purposes of data matching. That could include central government departments which, under these provisions, could theoretically—I repeat, theoretically—provide access to the children’s index or the national identity register. Private bodies such as banks, insurance companies and building societies will also be able to provide client details under Schedule 6. Information disclosed to the commission for the purposes of data matching and the results of those fishing expeditions could be disclosed to an unrestricted range of bodies for fraud detection or prevention purposes, or if there is another statutory duty to disclose the information.
This amendment seeks to explore what is meant by data matching in this context. We hope that the Government will explain what the Audit Commission is currently doing with the personal data of millions of people as part of the national fraud initiative, and describe what it might do in the future with the mass of personal data that has been collected and shared. Data matching could at one level involve little more than the comparison of two or more sets of data to see whether there are overlaps, which could identify someone who is claiming two benefits that are mutually exclusive, as the noble Lord, Lord Henley, has said.
Data mining involves the use of specialist software to profile innocuous mass data in order to identify patterns or characteristics that might indicate some sort of unusual behaviour or impropriety. It is essentially a fishing expedition which is not based on any suspicion or intelligence that a particular person or company has done anything wrong. The way in which data mining works can be illustrated with this hypothetical example. The Government want to crack down on tax evasion and think that the following factors are strong indicators that the person is engaged in this: regularly paying with cash rather than credit cards or cheques, having erratic streams of income, and taking extravagant holidays. They set up a computer program to search all bank account statements, local authority and central government records, and travel operator databases to identify these types of behaviour. The computer produces a list of every person who satisfies all three indicators and they are then subject to investigations by HM Customs and Excise.
At Second Reading, the noble Baroness, Lady Anelay, commented that,
“the Bill could open the way for operations under which software was used to search several databases to identify suspicious patterns of activity that simply could not be spotted when the data were seen individually”.—[Official Report, 7/2/07; col. 736.]
The consultation preceding the Bill acknowledged that there would be concerns about the legality of data mining. We believe that this would raise difficulties over compliance with DPA principles, and that, in human rights terms, proportionality issues will arise from the fact that data mining by its very nature will not be targeted or based on sifted intelligence. In order to be effective, huge quantities of data will have to be analysed. Data mining may well help to identify some people involved in fraudulent activities. But can identifying a few criminals justify the state trawling through all our personal data? We do not see how this kind of random, computerised fishing expedition into personal data can be proportionate.
Data mining can also give rise to serious practical concerns. At Second Reading, my noble friend Lord Thomas of Gresford said:
“It is the sort of thing that the supermarket card is designed to do to demonstrate to the management whether a customer buys tins of salmon or jars of Marmite. The patterns of behaviour thrown up by the data matching in Part 3 may or may not be meaningful; it is all a matter of chance. Depending on how they are interpreted, the Audit Commission will be able to point the finger at what is deemed to be a suspicious constellation of characteristics or behaviours in an individual. Instead of a system in which a person is suspected of a crime and is then investigated by the police, a trawl using the latest computer techniques will throw up names and those people will be investigated because of their characteristics or behaviours. Suddenly, we have grounds for a serious crime prevention order under Part 1”.—[Official Report, 7/2/07; col. 738.]
Many of us have experience of the inaccurate results thrown up by data-mining exercises conducted into the information held about our shopping practices on supermarket loyalty cards. Data mining is clearly not infallible. Where it leads to a person being sent vouchers for a brand they would never use, the data-mining error is merely an annoyance. If, however, it leads to an innocent person being subjected to a police investigation or to a preventive measure like a gangster ASBO, the personal cost will be much greater and the risk of error therefore unacceptable.
Given the principled and practical concerns, it is not surprising that other countries impose far more stringent safeguards on the ability of the state to mine personal data. I will not repeat what the noble Lord, Lord Henley, said about the German experience, but German law imposes even greater restrictions on the use of data mining to identify potential future behaviour. Liberty and Members on these Benches are concerned that no equivalent legal restrictions on data mining exist under UK law. We fear that parliamentary approval of data mining in the context of fraud prevention could be treated as a green light for the use of data-mining processes in many other contexts.
Amendment No. 112 would remove proposed new Section 32C of the Audit Commission Act 1998. New Section 32B requires,
“(a) a body subject to audit,
(b) an English best value authority”,
such as a county council or county borough council,
“to provide the Commission or a person acting on its behalf with such data (and in such form) as the Commission or that person may reasonably require for the purpose of conducting data matching exercises”.
New Section 32C broadens that so that data held by or on behalf of a person not subject to audit or which is a best-value authority may be disclosed to the commission. That, we believe, goes very wide indeed.
The noble Lord, Lord Henley, mentioned welfare fraud, while the noble Lord, Lord Burnett, mentioned tax evasion. When the Minister replies, can she confirm or deny that they are correct in their suspicion that these are indeed matters that will be disclosed as a result of data matching? Can she also say whether other kinds of fraud or serious crime are intended to be discovered through this process? It seems that there are quite genuine fears that an enormous amount of activity will be generated through data matching which, in general, will invade the personal privacy of individuals. There is also a risk that the process may lead to a reversal of the burden of proof on individuals or companies. Are these well grounded fears?
I hope to be able to reassure Members of the Committee that these are not well grounded fears, and I shall seek to explain why I say that. I take from the way in which the noble Lord, Lord Henley, moved Amendment No. 110B that he accepts that it would remove the ability of the Audit Commission to identify patterns and trends when carrying out a data-matching exercise, and that he has tabled the amendment more to explore the issue rather than necessarily to have that effect. I know that the noble Lord, Lord Burnett, sees the amendment slightly differently, and I accept the difference in the two approaches. Amendment No. 112 would remove the possibility of voluntary participation in the national fraud initiative by bodies not currently subject to the Audit Commission’s inspection or audit regime. Again, I have assumed for the purposes of this argument that that is not the purpose, but that we are simply looking at the way in which these structures will operate.
Perhaps I should remind the Committee that these processes have already proved extremely successful and useful in the identification of fraud. The Information Commissioner has made it clear that he recognises that privacy issues must be balanced against the prevention and detection of fraud, so that is already the position. We know that fraud is a real problem in a broad spectrum of cases and areas. The last national fraud initiative cycle identified £111 million, and CIFAS reported fraud avoidance losses of £790 million in the past year. We need to keep these two issues in proper balance. The noble Lord, Lord Henley, says we could remove the definition, but we think it must stay as it is if data-matching exercises are to be fully functional and deliver the results needed.
At its core, data matching is about comparisons of sets of data to see how far they match. It is an important technique that enables individual fraudsters to be identified. However, data matching is about more than just looking at data on a micro level. It can also identify patterns and trends that are indicative of fraud which may be occurring on a systemic or organised basis; for example, a ring of actual fraudsters operating in concert across a geographic area. That can prove vital in informing affected authorities that they need to guard against a new and emerging risk and providing them with the information they need to co-ordinate their efforts in tackling the problem. That is one example of why the definition should remain as it currently is.
As noble Lords know, the Audit Commission chooses data sets on the basis of its experience of those areas where fraud is prevalent or involves serious financial loss. In addition, the commission also acts where it is advised by bodies of emerging areas of risk. I should add that the identification of patterns and trends also plays a crucial role in ensuring proportionality, in that it enables the commission to monitor whether or not certain fields of data are still in fact useful in detecting and preventing fraud, and whether new risks are emerging. If matches no longer arise in relation to a particular type of data, the commission can identify that and cease to collect it. If its pilot exercises show an upward trend in actual matches, the commission will know that there is a risk area where it should be focusing its efforts. In each case, it is the trend of actual matches, indicative of a certain level of actual fraud, which informs its activities.
The commission’s decisions are communicated to each participating body, with detailed guidance. That information is put on the website so we can see what is happening. The commission has no desire to collect more data than necessary, not only in the interests of complying with data protection and human rights legislation but also because processing data unnecessarily is an uneconomic and inefficient use of its resources. That provision enables the commission to target its efforts with greater acuity. Without the ability to identify patterns and trends, it is difficult to see how the Audit Commission could discharge its duty to conduct the national fraud initiative in a proportionate way.
Having said that, I am happy to reassure the Committee that the Audit Commission has no intention of using data matching for the purpose of profiling individuals likely to commit future crime, in the way that has been suggested by a number of noble Lords in this debate today and earlier. That is simply not what the national fraud initiative is about.
It is likely that a trend in the nature of behaviour will be identified. Once it is identified, that information will be passed to the relevant authority, which will then look at the matches and the areas that have been so identified and consider whether further inquiry is warranted by the information. The Audit Commission will not keep the detailed information after it has used its data sets—it really is setting out the trends: “Where do we have to look and what have we got to look for?”. We know the success that this process has had by virtue of the amount of fraud which has thereby been identified. This method has proven very successful in assisting the agencies and the authorities which are given responsibility for making the targeted inquiry into areas where they should properly go. It also enables a local authority, for example, on identifying a possible pattern of fraud, to change its systems and, in doing so, to make that system more resilient to potential fraud. That is why, as I said earlier, the commission is also looking at areas where there is no longer activity and where matching is not occurring, so that we know that that method of committing fraud has perhaps stopped and that people may have moved on to something else once a gap is closed.
The Audit Commission will use its findings to determine which sets of data are most productive in identifying fraud and focus on them, because making sure that we target resources appropriately is an issue. It will also advise the participating bodies where they are getting a lot of matches and therefore need to tighten up their system. The Audit Commission has found that that has been extremely useful to all concerned.
The noble Lords, Lord Henley and Lord Burnett, asked about the children’s index. The provision is very unlikely to involve disclosure from the children’s index as it is hard to see how this could assist the fight against fraud, which is the determining factor. However, I am very happy to look into that issue and write to noble Lords as appropriate.
As I have made clear, disclosure is authorised to anybody provided that it is legitimately required for the prevention and detection of fraud. This will include the participating bodies and, if appropriate, the police. As we discussed earlier, there is a criminal offence for further wrongful disclosure. Mention was made of patients’ data. Patients’ data are not disclosable beyond the National Health Service.
Amendment No. 112 would remove new Section 32C of the Audit Commission Act, which provides for the voluntary provision of data to the Audit Commission for use in the national fraud initiative. It would ensure that the only bodies to participate in the national fraud initiative would be those that are currently within the commission’s audit and inspection regime. This would defeat one of the key objectives of the new legislation, which is to build on existing mechanisms to tackle fraud across the public sector. This would be particularly unfortunate in relation to the national fraud initiative, which, as I hope I have adequately explained, has a demonstrable track record in the fight against fraud. The amendment has implications, not just for the effective prevention and detection of fraud in England, but also for the UK generally, as Scotland, Wales and Northern Ireland may, in due course, wish to provide their data to the Audit Commission under this provision. The amendment would prevent any such benefits being achieved.
New Section 32C does not enable bodies to provide data to the commission on a carte blanche basis. Rather, the Audit Commission must first be satisfied that it is in fact appropriate for it to accept this data for the purposes of its data-matching exercises. This in turn will depend on the Audit Commission’s experience of its use in the fight against fraud, suitably informed where necessary by pilot exercises. That is the whole purpose of having the pilots—so that can better be identified. Secondly, no one will be able to provide the commission with patient data on a voluntary basis. These safeguards ensure that only the minimum of relevant data will be included in data-matching exercises.
It is important also to consider the practical consequences of removing this section. First, it would mean that the Audit Commission could match benefit data that were administered by local authorities—for example, council tax and housing benefit—but would not be able to match national benefit data that are administered by the Department for Work and Pensions. Such gaps raise the obvious but avoidable risk that fraudsters will slip through the net. Secondly, under this amendment we would also lose the valuable contribution that private sector bodies can make in reducing fraud in the public sector. Early indications suggest that significant numbers of landlords with mortgages from private sector companies are fraudulently claiming housing benefit from local authorities. It would be extremely unfortunate if we missed the opportunity to engage the private sector in helping to strengthen the clasp on the public purse.
The noble Lord, Lord Henley, asked whether the commission had a complaints system for those who have wrongfully been investigated. It does—and, in the eight years of running the national fraud initiative, the commission has not had a complaint of wrongful investigation. That gives us some assurance that the commission has worked with propriety and gives strength to the assessment made by the noble Baroness, Lady Anelay, of its competence and probity.
The noble Lord, Lord Burnett, raised issues about data mining, which would allow profiling of individuals whose behavioural characteristics were indicative of a propensity to commit fraud in future. I hope that I have reassured him that that is not the intention and that the commission will look only at patterns and trends to identify emergent risks at the overall systems level, not to identify individual propensity to commit fraud in future. Matches relating to individuals disclose only current analysis indicative of current fraud. I know that the noble Lord would think that was a good as opposed to a bad thing, and I can reasonably anticipate that the noble Lords, Lord Henley and Lord Hylton, and the noble Baroness, Lady Anelay, would concur with that.
Of course, I hear what the noble Lord, Lord Hylton, says about the concerns that have been expressed. He sought reassurance, and I hope that I have explained how the system works and that the commission matches the issues in ways that are appropriate. To address welfare fraud against local authorities is important, and these provisions would allow national welfare fraud to be identified, as well as tax fraud, provided that the relevant government departments volunteered the information. They are not obliged to use this; it is a facility that would be available to all, which we think has real merit.
I have done what the noble Lord, Lord Henley, asked me to do; namely, to take these issues very seriously and try to answer as fully as I can in the hope that we shall not have to return to them.
I wish to pick the noble Baroness up on a couple of points that I did not hear answers to. It is clear that, if an individual is identified as a result of data-matching, their confidential data will be disclosed. It may be a case of matching benefit data with bank account data, looking for certain patterns of movements of money out of an account after it has been received as a benefit to indicate that that person is likely to have more than one identity in the benefit system. The agency taking action will need to know about those bank accounts and the data that go with them. Those are fairly sensitive data.
As I said previously, a current example is the licence fee and the purchase of television receivers. The authorities are notably insensitive in dealing with matches where they find a television receiver being purchased without a licence. They write extremely rude letters and can get quite harassing. People may be sent letters about the state of their bank accounts and be asked to justify what is going on. They may suddenly find that their personal details have been disclosed. Sometimes the match will be right and there will be something there, but on other occasions someone will be left feeling extremely upset.
What safeguards does the noble Baroness envisage will be imposed on an authority in possession of very private information when it undertakes an investigation? Will it have to keep its investigations private until it is assured that a fraud is being committed, or will people find themselves involved in an investigation and forced to justify perfectly innocent activities?
I hope that the Minister can respond to the following example. Tax evasion is fraud; it is as clear as that. I am sure that no Member of the Committee has any truck with tax evaders, not least because every pound evaded means a pound more for the rest of the law-abiding population to pay in taxation. Could there be monitoring and profiling of people who make visits to tax-haven countries? Would that trigger an inquiry into those individuals’ bank accounts to see whether they had paid for those visits and so forth? Would it lead to Her Majesty’s Revenue and Customs undertaking a detailed inquiry? The Minister assured us at Second Reading that there would not be trawls of this nature. Will she deal with that example and let us know what happens if patterns or trends are discovered? If it is decided to monitor visits to tax-haven countries, what would happen as a result?
The noble Baroness, as always, was clearly trying to be helpful. However, am I right in deducing from her reply that data-matching has been going on for a considerable number of years already, presumably on some legal basis or other? If that is the case, why are Clause 65 and Schedule 6 necessary? If I may say so, they are remarkably uninformative because they do not explain the purpose of all this activity. We have heard it confirmed that welfare fraud and tax evasion come within the scope of the measure, but I originally asked what other kinds of fraud and serious crime the Government were trying to catch.
First, I do not know whether the noble Lord, Lord Hylton, had the advantage of being with us throughout the debate today or on earlier occasions. At the beginning of this evening’s discussion, I sought to set out how the clauses in this part operate together, and to give a bit of the history on how the fraud initiatives were set out, the basis on which the Data Protection Act operated and how the Audit Commission worked with the fraud initiative together with the Information Commissioner. I invite the noble Lord to look at that in due course, because it may be of assistance.
In a short form, I will look at where we are now. Data matching has been undertaken by the Audit Commission through the national fraud initiative. It has identified, as a result, significant areas of fraud, and through that expertise it has been able to assist the various agencies to better protect their systems. It is really looking at systemic issues that the authorities and the agencies can address to make their systems more robust and more resilient to fraud. In addition, that has revealed trends and patterns that merit further investigation.
The noble Lords, Lord Burnett and Lord Lucas, emphasised their concern about improper use of personal data in a way which, by its essence, would be contrary to the Data Protection Act. I have sought to reassure the Committee that the Act, and the way in which it bites on those who seek to use personal data, will be fully functional in relation to those provisions. The matches that will be disclosed relating to individuals will only disclose current analysis indicating current fraud; they will not refer to speculative future patterns. The Audit Commission will not use the national fraud initiative to trawl for patterns and trends concerning individuals’ behaviour or conduct, only systems-based, non-personal information to identify emergent risks. That is the sole purpose to which they will put those data.
Data-matching is a well proven technique that is authorised under the powers given to auditors under the Audit Commission Act. We are now legislating to extend its benefits to new bodies and to put in place appropriate protections. One of our concerns, which is shared by the Committee, is that if we are to move forward it should not cause a diminution in the safeguards necessary to protect the data, and we should do it only on a basis that is safe, transparent and proper. The Bill extends this work to further bodies, and we are saying that the safeguards that are currently in place for the restricted number of bodies should extend to all of those who will undertake it. That is proper; therefore, it has been of considerable assistance to us, first, to identify people’s concerns and, secondly, to be able to address them, because noble Lords are raising perfectly proper issues on which they seek reassurance.
These were very much the questions that we in the Government asked ourselves as we prepared the Bill: what are the safeguards? How can we guarantee that the Data Protection Act will continue to apply? What will be the role of the Information Commissioner, and how will it all fit together?
I hope that I have been able to reassure the Committee that the national fraud initiative does not need or desire to look at the content of bank accounts—the noble Lord, Lord Lucas, was concerned about that—but it does check whether the account has been disclosed to the housing benefit authorities, for example. We are looking at the trend of fraud right across the piece. We believe that this is a productive and successful way of better identifying fraud in a way that is safe, proportionate and fair.
I thank the Minister for a pretty full reply. I am not sure whether it was quite full enough but we will get to that in due course.
First, the noble Baroness implied that there was some difference in motive over this amendment between us and the noble Lord, Lord Burnett, on the Liberal Democrat Benches. So far as we were concerned, it was largely a probing amendment, but we may want to come back to it at a later stage despite the fullness of the noble Baroness’s response.
Secondly, the Minister ended with a remark about housing benefit to my noble friend Lord Lucas. I do not underestimate the scale of the problem of fraud involving housing benefit or other social security benefits. It is now 15 years since I was responding from the government Benches on social security and I remember being grilled on a number of occasions by noble Lords about the alleged £2 billion a year being lost in social security fraud—a great deal of that sum was housing benefit. The figure under this Government is considerably greater than it was then. The noble Baroness shakes her head; I may put that question to her directly on another occasion. I understand how serious the issue is. We are behind the noble Baroness on that so long as we can get these things right.
Thirdly, we want to look very carefully at everything that the noble Baroness said. I asked many questions when I moved the amendment and I think I heard responses to some, but certainly not all, of them. My noble friend Lord Lucas did not hear answers to all his questions. I did not hear the noble Baroness deal with the comparisons that the noble Lord, Lord Burnett, and I made with Germany. She did not deal with the national identity register. No doubt she will deal with those later or we will raise them at a later stage.
The noble Baroness is the very model of assiduity in responding to all questions put to her. I am sure that she will write to us in due course. To be fair, I did not expect her to be able to answer the array of questions that I raised. I am grateful that she corrected me and said that she would respond in due course. I am also grateful for her response so far and I am sure that we will want to come back to this issue at another stage, just as we will want to consider one or two more amendments tonight—I see the government Chief Whip in the Chamber. The Minister did not go quite as far as we would have liked but we will come back on this. I beg leave to withdraw the amendment.
Amendment, by leave, withdrawn.
[Amendments Nos. 110C to 112A not moved.]
113: Schedule 6, page 68, line 5, at end insert—
“( ) The code of practice and any revisions to it prepared under subsection (1) shall be subject to the approval of the Information Commissioner; and, following his approval, to approval by affirmative resolution of both Houses of Parliament.”
The noble Lord said: I shall speak also to Amendment No. 114A. New Section 32F, which we have been discussing, provides that the Audit Commission must prepare and keep under review a code of data-matching practice. It sets out that all those bodies and other persons involved in the process must have regard to the code of data-matching practice and requires the commission to consult all bodies identified in new Section 32B(2) before preparing or altering the code of data-matching practice and such other bodies as the commission sees fit.
The two amendments aim to raise questions about the involvement of the Information Commissioner in the design and production of the code of practice. Amendment No. 114A inserts the words, “the Information Commissioner’s Office” into new Section 32F(2) to ensure that he is specified in the Bill as someone who must be consulted by the Audit Commission. I am sure that the noble Baroness will suggest that he is covered by the words,
“such other bodies or persons as the Commission thinks fit”—
although at Second Reading, she highlighted the fact that the national fraud initiative regularly consults the Information Commissioner and will continue to do so. In light of concerns expressed throughout debates on this part and the fact that it is confirming usual practice, it is important that he is named in person in the Bill. That way there can never be any remote possibility that the Audit Commission would fail to think of him.
Amendment No. 113 tries to address the issue in an alternative way and inserts a new subsection after subsection (1). It would ensure that the code of practice has to jump through two hoops before approval, so that checks can be made that the data use to which it refers is justified and proportionate. First, the Information Commissioner would have to sign it off; then Parliament would have a chance to consider it. As we have discussed, the code of practice is a key component in securing the trust of the public in the Audit Commission and in the other bodies which handle our personal data.
Is there not a conflict of interest for the organisation that has the responsibility of doing that data matching in drafting the code of practice? I believe that the Information Commissioner should have more control over the code of practice rather than merely being consulted. That would ensure that it is produced in an independent and transparent manner. If the Information Commissioner has no control over the content of the code, how can he protect data subjects properly? I beg to move.
I speak to Amendment No. 114A which stands in my name and that of my noble friend Lord Dholakia. We believe that it is extremely important for the Information Commissioner to be consulted on these matters and that they are agreed with her or him. The Government have acknowledged that. At Second Reading, the noble Baroness, Lady Scotland, explained that the existing national fraud initiative,
“operates to a code of practice, on which the Information Commissioner has been consulted, and that will continue to be the case”.
She stated that,
“working with the Information Commissioner, we will be seeking to ensure that the arrangements are transparent and command public confidence, are proportionate and are subject to periodic review”.—[Official Report, 7/2/07; col. 732.]
We believe that this very important role for the Information Commissioner should be a statutory requirement and should be on the face of the Bill.
I, too, support the amendment. It is a price well worth paying, given the intense interest, sensitivity and reservations expressed in the Committee and elsewhere. I hope that some commentators will not find it an irksome imposition. It is not: it would demonstrate that a strict discipline is being applied to what we are suggesting. It would therefore provide a platform to ensure that decisions are subject to scrutiny, as has already been said by noble Lords. Overall, the gain is considerable. It would not be an undue burden, and I hope to see it included in the Bill as it leaves for another place.
The amendment in the name of the noble Baroness, Lady Anelay, and the noble Lord, Lord Henley, suggests that the Audit Commission’s code of practice be subject to the approval of the Information Commissioner and the approval of both Houses by affirmative resolution. On the face of it, that is a reasonable proposition.
As a regulator, the Information Commissioner does not formally approve or agree codes of practice, as it is crucial that he retains his independence. Bodies drafting such codes are instead encouraged to consult the Information Commissioner on the contents of that code; that is what the Audit Commission does. Indeed, the Information Commissioner provided a foreword to the current code of practice which the Audit Commission has prepared for its national fraud initiative.
The Information Commissioner already has a comprehensive range of powers over the bodies that will be sharing or matching data and the work that they undertake. He can make assessments as to whether or not they are complying with any of their duties under the Data Protection Act. He can issue notices requiring these bodies to furnish him with any information he may specify for that purpose. He also has enforcement powers to rectify incidents of non-compliance, and the ability to prosecute individuals for offences under the Act. Adequate safeguards are in place through the combined effects of the Data Protection Act and the regulatory regime of the Information Commissioner, the European Convention on Human Rights and the additional safeguards which have been incorporated into these provisions.
The noble Lord, Lord Henley, made a point about a conflict of interest with the commission drafting its own code. It is common practice for bodies to prepare their own codes, which the Information Commissioner encourages. He actively involves himself in existing voluntary codes and will quickly say if one is not up to scratch or he is dissatisfied with its drafting. The Information Commissioner provides helpful guidance in that regard.
Amendment No. 114A, proposed by both opposition Benches, suggests that there should be a specific statutory duty on the Audit Commission to consult the Information Commissioner when preparing the code of data-matching practice. We have earlier had considerable discussion on Section 51 of the Data Protection Act, which requires the Information Commissioner to promote both good practice and the observance of the requirements of that Act. He has a duty to prepare codes of practice where he considers it appropriate and is therefore already charged with regulation in this area. It is therefore preferable to leave this matter to the good sense and operational effect of the Information Commissioner's work; the Audit Commission would in any event always seek to consult him. On this basis, the amendment is unnecessary. Consultation will be undertaken in any event. The oversight and safeguards that noble Lords seek to govern the sharing and matching of data already exist, so we hope that the amendments can be withdrawn.
I thank the noble Lord, Lord Dear, for offering his support, particularly at this late hour. I also thank the Minister for responding on behalf of the Government. Over the years, I have heard a great many government responses to amendments—I have also given a great many in my time—and I can recognise the ones that have the word "resist" written at the top of them, as most of them do. That response clearly did, despite the fact that these are very small amendments that ask for one very small thing: the greater involvement of the Information Commissioner. We tabled them because we believe that it is important for the Information Commissioner to be named in the Bill; if he is not, the Bill might be interpreted as meaning that, because he is not mentioned, he ought to be excluded. I am sure that that is not the Government's intention and that the Audit Commission would always want to involve him, but I think that it would be better if the Government mentioned the Information Commissioner in the Bill.
We will leave the amendment in that form at this stage so that over the ensuing weeks the Government can ponder this matter to see whether they wish to include the Information Commissioner. They may bring forward their own amendment on Report to include him or her, as proposed by us and the Liberal Democrats and as supported by the noble Lord, Lord Dear. This is something that we will probably want to come back to. I beg leave to withdraw the amendment.
Amendment, by leave, withdrawn.
114: Schedule 6, page 68, line 5, at end insert—
“( ) No data matching shall take place unless subject to an agreed code of practice.”
The noble Lord said: I hope that Amendment No. 114 will be the last amendment that we take this evening. There is no need for the noble Baroness to pass over a glass of water for my cough, as I have one here already, but I am very grateful to her for her kind thoughts, and I will try not to create a by-election, as the noble Lord, Lord Bassam, seems to be suggesting.
This amendment would insert a new subsection after new Section 32F(1), to be inserted by Schedule 6 to the Bill. It does what it says it does: it prevents data matching from taking place until the code of practice under this section has been agreed. This amendment ties in well with our debate on the detail of the code of practice and with earlier debates on Amendments Nos. 109 and 111, which were tabled by my noble friends Lord Lucas and Lady Anelay.
New Section 32F(2) states:
“Regard must be had to the code in conducting and participating in any such exercise”.
However, it is not clear whether data matching can occur before or during the design of the code. I hope that the Minister will confirm that that is not the case. It would not do for data matching to occur outside the code, especially when we have concerns about the code itself.
I reiterate that we on these Benches want to ensure that the Information Commissioner has some control over the code of practice and that it is approved by Parliament or, at the very least, that it is an agreed code. As your Lordships’ House is very much aware, a duty to consult is not the same as a duty to take on recommendations that are made. While I am sure that, in practice, people would be somewhat foolish not to take into account any comments made by the Information Commissioner, one cannot guarantee how future versions of the Audit Commission may act in that regard. It is essential that we ensure that data matching by the Audit Commission is carried out in a transparent, proportionate and accountable way that provides protection for the individual. I beg to move.
We support the amendment. It tightens up the provisions on data matching by ensuring that it can take place only under an agreed code of practice. It goes some small way to balancing the interests of the state with the interests of the public. That is why we support it.
I am grateful that the noble Lord, Lord Henley, was able to finish speaking to the amendment; we were worried for a moment that he was going to make himself unwell on the subject. I am glad that is not the case. As he explained, Amendment No. 114 suggests that there should be an express requirement that data matching should not take place unless it is subject to an agreed code of practice. For reasons that we rehearsed when examining earlier amendments, and although we understand that point of view, we must continue to resist that approach and in particular this amendment.
I can give the noble Lord one reassurance. It is already implicit that data matching can be undertaken only if a code of practice is in place. New Section 32F places a duty not only on the Audit Commission to prepare a code but on everyone involved in data matching to have regard to it. New Section 32F also requires the Audit Commission to consult participating bodies when drafting this code of practice and any other bodies as the commission thinks fit.
We have already discussed the role that the Information Commissioner played in creating the existing national fraud initiative code of data-matching practice. We therefore do not think that this amendment is necessary. It may also be worth underlining the point that if we were to go in the direction in which the noble Lord invites us to go, the Information Commissioner himself might have concerns about whether his independence was being compromised. I know that the commissioner greatly values that independence. That thinking greatly informs our approach to these amendments, which we think are on the wrong side of the line. We think that we have the balance about right. However, I am grateful to the noble Lord for his amendment. We understand where he is coming from with it; it is simply that we disagree with its general approach.
I am again very grateful for the noble Lord’s concern for my health and his explanation that the amendment is not necessary. Although I am not satisfied with his explanation, I will console him by saying that I will not divide the Committee at this stage of the evening on this subject. He looks relieved at that. However, we have concerns about the matter and will want to come back to it at a later stage. I will end, and I hope end the proceedings for the night, by saying that he will hear more of this in due course. I beg leave to withdraw the amendment.
Amendment, by leave, withdrawn.
[Amendment No. 114A not moved.]