House of Lords Hansard
Public Authorities: Algorithms
14 March 2019
Volume 796

Question

Asked by

To ask Her Majesty’s Government what consideration they have given to the standards and certifications required for the algorithms used in decision-taking by public authorities and agencies.

My Lords, last year the Government published the Data Ethics Framework, which sets out clear principles and standards for how data is used in the public sector—an important tool guiding the ethical use of algorithms and AI technologies. The Government have also recently set up the Centre for Data Ethics and Innovation, which will provide independent, expert advice on the governance of data and AI technology. The centre’s first two projects will study the use of data in shaping people’s online experiences and the potential for bias in decisions made using algorithms. This work and the centre’s future work will play a leading role in ensuring transparency and accountability in the ethical use and design of algorithms.

My Lords, some 53 local authorities and about a quarter of police authorities are now using algorithms for prediction, risk assessment and assistance in decision-making. The Centre for Data Ethics and Innovation, for all its virtues, is not a regulator. The Data Ethics Framework does not cover all aspects of algorithms. As the Minister will know, it was quite difficult finding a Minister to respond to this Question. Is it not high time that we appointed a Minister—as recommended by the Commons Science and Technology Committee—who is responsible for making sure that standards are set for algorithm use in local authorities and the public sector and that those standards enforce certain principles such as transparency, fairness, audit and explainability and set up a kitemark so that our citizens are protected?

My Lords, there was no difficulty in finding a Minister in this House: answering the noble Lord’s very sensible Question was pinned on me at a very early stage. The point about the Centre for Data Ethics and Innovation, which will publish its interim report on algorithms in the summer—relatively soon—is that it will look across the whole area and highlight what should be done in regulation terms. It will be one of the things that we expect the centre to look at, so the genuine concerns raised by the noble Lord can be considered by this forward-looking body.

Would my noble friend explain what an algorithm is? Should I be concerned about it?

My Lords, I am not an expert, but I am sure that the noble Lord can go back to his school days and remember from his study of Greek that Euclid was producing algorithms in 300 BC—he will remember that this was for finding the greatest common divisor of two numbers. Essentially, an algorithm is a set of rules that precisely defines a sequence of operations. Today, they are used mainly by computers for calculations, machine learning and artificial intelligence.
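[Editorial note: the Euclidean algorithm mentioned above—finding the greatest common divisor of two numbers—is a convenient illustration of "a set of rules that precisely defines a sequence of operations". A minimal sketch in Python (not part of the original proceedings):

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: repeatedly replace the pair (a, b)
    with (b, a mod b) until the remainder b is zero; the
    surviving value a is the greatest common divisor."""
    while b:
        a, b = b, a % b
    return a

# Example: the greatest common divisor of 48 and 18 is 6.
print(gcd(48, 18))  # -> 6
```

Each step is fully determined by the current pair of numbers, which is what makes the procedure an algorithm in the precise sense given above.]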

My Lords, clearly, I must voice the general opinion expressed in other ways in appreciation of the Minister’s reply to a very pernickety noble friend of his, who is sitting on the Bench behind him. We have heard reports of information that will come from the data ethics people in the summer, and we have a White Paper on online harms coming very soon and then a period of consultation. I always seem to be stuck at the Dispatch Box acknowledging that the answer to the question I really want to ask will come in months’ or perhaps years’ time. The noble Lord who put the question is quite right: things are happening in the field of technology now, with all those local councils and police forces using algorithms to forecast possible courses of action and take policy decisions in light of what they think will happen. We are told that consultative experiences are about to happen, but is it “when” or “if”? It would be good if the Minister could somehow bypass or short-circuit the labyrinthine things that are happening elsewhere and give us some reassurance that certification for things which are already happening in the field and shaping our future can be looked at critically.

It is not completely fair to say that nothing has happened. In areas where personal data is used, for example, it has to be used lawfully under the aegis of the Data Protection Act. The Information Commissioner recently said that she was minded to issue guidelines on the use of data in respect of children. The Information Commissioner is a powerful regulator who is looking at the use of personal data. We also have the Digital Economy Act, and we have set up the Data Ethics Framework, which allows public bodies to use the data which informs algorithms in a way that is principled and transparent. Work is going on, but I take the noble Lord’s point that it has to be looked at fairly urgently.

My Lords, when the Chancellor asks the Competition and Markets Authority to scrutinise the transparency of Google and Facebook, are the Government confident that they are applying the same rules of transparency to public services in the UK? Is not waiting for an interim report a little bit too late, when the HART system used by Durham Police to predict reoffending, for example, is already well under way? Does the Minister accept that failure to properly scrutinise these kinds of algorithms risks the racial bias revealed by the investigation into the Northpointe system in Florida?

I understand that there are issues about facial recognition systems, which are often basically inaccurate. The essential point is that biometric data is classified as a special category of data under the Data Protection Act, and the police and anyone else who use it have to do so within the law.