MPs discuss live facial recognition for the first time

For the first time, MPs have debated police use of live facial recognition (LFR) technology, concluding that it should be governed by a single statute rather than the patchwork of laws and official guidance that currently regulates police deployments.

During the November 13, 2024 debate in Westminster Hall, MPs, including members of both front benches, covered a wide range of topics: the impact of LFR surveillance on privacy; problems of bias, accuracy and racial discrimination; the absence of a clear legal framework governing police use of the technology; and the risk that a wider roll-out could further erode public trust in the police.

Although there were disagreements over the effectiveness of LFR as a crime-fighting tool, MPs broadly agreed that there are legitimate concerns about police use of the technology and that proper regulation is needed.

Most MPs who took part also lamented how little public discussion there has been of police use of the technology to date.


Although LFR has received limited parliamentary scrutiny over the years, mostly in the form of written questions and answers, the debate, called by Conservative MP John Whittingdale, marks the first time lawmakers have substantively discussed police use of LFR in the eight years since the Metropolitan Police first deployed it at the Notting Hill Carnival in August 2016.

Since that initial deployment, parliamentarians and civil society have repeatedly called for new legislative frameworks to govern law enforcement’s use of the technology. Those calls have come from the UK’s Equality and Human Rights Commission; two former biometrics commissioners, Paul Wiles and Fraser Sampson; an independent legal review by Matthew Ryder QC; three separate inquiries by the Lords Justice and Home Affairs Committee (JHAC) into shoplifting, police algorithms and police facial recognition; and the House of Commons Science and Technology Committee, which called for a moratorium on LFR as early as July 2019.

Prior to his resignation in October 2023, Sampson also brought attention to the wider “culture of retention” in UK policing regarding biometric data and the lack of clarity surrounding the scope and magnitude of monitoring in public spaces.

Nonetheless, the Home Office and law enforcement agencies have consistently argued that a “comprehensive legal framework” already exists, comprising the Police and Criminal Evidence Act (PACE) 1984, the Data Protection Act 2018, the Protection of Freedoms Act 2012, the Equality Act 2010, the Investigatory Powers Act 2016, the Human Rights Act 1998, and common law powers to deter and identify criminal activity.

Policing Minister Diana Johnson summarised the new Labour government’s stance on police LFR use as the debate came to a close. She stated that although the technology has “the potential to be transformational for policing,” there are “legitimate concerns” about its use, “including misidentification, misuse, the effect on human rights, and individual privacy.”

Johnson added that the Met’s use of LFR has led to 460 arrests so far this year, including of more than 45 registered sex offenders who had breached their conditions, and noted that facial recognition is already subject to common law powers, data protection, equality and human rights laws, as well as guidance from the College of Policing.

“This administration wants to give serious consideration to the issues brought up and how we can best allow the police to employ live face recognition in a way that protects and upholds public trust,” she stated.

“We must weigh privacy issues against our expectations of the police to maintain the safety of our streets when evaluating its current and future uses. Therefore, I’m determined to undertake a programme of engagement over the next few months to help shape this thinking.”

According to Johnson, the government will hold a number of roundtables with regulators and civil society organizations before the end of the year to inform its future thinking. This follows early discussions with police, in which senior officers said the absence of a clear framework was holding them back from using the technology.


Accuracy issues

Shadow Home Secretary Chris Philp, who advocated for much greater police use of the technology during his time in government, including calling for LFR watchlists to be connected to the UK’s passport database, drew attention to a recent National Physical Laboratory (NPL) study that found “no statistically significant” racial bias in police LFR systems when they are used with specific settings.

“There were reports—true reports—that the algorithm at the time was biased against people of color when this technology was first introduced, roughly seven years ago,” Philp added. “The National Physical Laboratory, the country’s top testing facility, has conducted conclusive tests on the algorithm, which has undergone significant development since those days.”

He said the NPL specifically found that, by setting the “face-match threshold” at 0.6 (where zero is the lowest similarity and one the highest), both the Met and South Wales Police could achieve “equitable” outcomes across gender and ethnicity when using the Neoface V4 facial recognition software supplied by the Japanese biometrics company NEC.

Other MPs, however, disagreed, pointing out that although police may achieve greater accuracy by operating LFR at that specific threshold, there are no rules preventing them from lowering it at any point.

Labour MP Dawn Butler stated that even with the LFR software set at 0.6, it was still less accurate than so-called police “super-spotters”, specialist officers trained to quickly identify people in crowds. “There’s no such thing as no misidentifications, and it’s also very easy for a police service to lower that number, because we have no judicial oversight of it,” she added.

“There may be an instance where a police force is attempting to demonstrate that the system it purchased is cost-effective. For example, if a police officer is not receiving many hits at 0.6, they may reduce it to 0.5 in order to receive more hits, which may result in more persons being misidentified.

“Regulations should be in place on this matter. One of the most serious things we can do in society is to take away someone’s liberty, so if we’re going to do anything that makes that happen more quickly, we need to consider it very carefully.”
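Butler’s warning is, at root, about simple arithmetic. The sketch below is a minimal, hypothetical illustration, not NEC’s Neoface algorithm (whose internals are not public): the similarity scores and their distributions are simulated purely to show how lowering a face-match threshold from 0.6 to 0.5 produces more hits of both kinds, genuine watchlist matches and misidentified members of the public.

```python
import random

# Toy model of threshold-based face matching. This is NOT NEC's Neoface
# algorithm (its internals are not public); the similarity scores and
# distributions are invented purely to illustrate the threshold trade-off.

random.seed(42)

# Simulated similarity scores in [0, 1]: 0 = lowest, 1 = highest similarity.
# People genuinely on a watchlist tend to score high; passers-by score lower.
watchlist_scores = [random.betavariate(8, 2) for _ in range(50)]   # true matches
passerby_scores = [random.betavariate(2, 5) for _ in range(5000)]  # innocent public

def count_alerts(threshold):
    """Count alerts raised at a given face-match threshold."""
    true_alerts = sum(s >= threshold for s in watchlist_scores)
    false_alerts = sum(s >= threshold for s in passerby_scores)
    return true_alerts, false_alerts

for threshold in (0.6, 0.5):
    true_alerts, false_alerts = count_alerts(threshold)
    print(f"threshold={threshold}: {true_alerts} true alerts, "
          f"{false_alerts} members of the public flagged in error")
```

In this toy model, dropping the threshold yields a few extra genuine alerts but a disproportionately larger jump in false ones, which is precisely the trade-off Butler argued currently sits outside any judicial oversight.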

Racial bias and trust

Even with this threshold in place, Lambeth Labour MP Bell Ribeiro-Addy contended that mistakes can still happen, particularly if photos taken from publicly accessible websites or police databases are incorrectly labeled.

“It’s nearly a given that pictures will be incorrectly labeled and that innocent individuals will have unnecessary encounters with the police. When attempting to identify women and people of color, the Metropolitan Police’s own testing of its facial recognition algorithm revealed disproportionately higher inaccuracy rates,” she said, citing a 2023 study by civil liberties group Big Brother Watch, which found that more than 89% of all UK police LFR alerts since the technology’s introduction have incorrectly identified members of the public as people of interest.

She added that the deployment of potentially defective technology will only serve to increase the number of stops, searches and possibly even wrongful detentions of ethnic minorities, who are already stopped and searched at disproportionately high rates.

“Why should we look at this intrusive, automated biometric software any differently?” she asked, arguing that allowing police to gather and check other types of biometric information, such as DNA or fingerprints, from people in the street would not be acceptable, and that increasing stops via LFR could further erode trust in the police, especially among ethnic minority communities that are already over-policed and under-served.

Despite claims from police and their LFR suppliers that people’s biometric data is immediately deleted if it does not match any images on a watchlist, independent MP Iqbal Mohamed pointed out that Google’s Incognito browsing mode was also meant to be “very private” until it emerged that the company was storing that data in violation of data laws. “It’s not always true when companies tell you that things are deleted right away,” he remarked.

According to various MPs, the continued implementation of LFR would “undermine several of our fundamental rights,” such as the rights to privacy, freedom of assembly and expression, and non-discrimination; “exacerbate existing inequalities and discrimination”; and “cause further division and mistrust of the police.”

Judicial oversight and specific legislation

Conservative MP David Davis said the technology should not be left to police discretion or non-statutory guidance, emphasizing the need for judicial oversight and for specific legislation setting out clear rules for its use.

A number of MPs also discussed the future of LFR, arguing that regulations must be in place before the technology is rolled out even more extensively, and raising concerns about mission creep and the prospect of connecting the UK’s six million CCTV cameras to facial recognition software.

“The technology has a tendency to slip,” said Davis. “We first implemented automated number plate recognition (ANPR) to keep an eye on IRA militants traveling from Liverpool to London back in the day. That was its precise function, but without any parliamentary sanction or statutory amendment, it was subsequently used for a dozen additional purposes.”

Mohamed said the absence of clear laws and judicial oversight left a great deal of room for police abuse and overreach, emphasizing how this could be used to undermine civil liberties, particularly the right to protest. “The fear of being watched or recognized by facial recognition can discourage people from taking part in demonstrations or public events,” he added.

Philp acknowledged that LFR accuracy has improved significantly since the Met’s initial deployment in 2016 and recognized the operational benefits it can provide for law enforcement, but he also emphasized the need for judicial and legislative oversight of the technology.

He stated, “In Croydon, [LFR] has led to about 200 arrests of people who would not have been arrested otherwise, including for cases like domestic burglary, fraud, Class A drug supply, grievous bodily harm, and a man who had been wanted for two rapes. Without this technology, they would still be free.”

Although it is “not true to say there is a complete vacuum as far as rules and regulations are concerned,” Philp explained, “there is merit in clarifying at a national level where these guidelines sit.”

According to him, using some form of “regulation-making power” would be a more sensible course of action than passing new primary legislation. He added that he would not want to see the UK follow the European Union’s lead with its Artificial Intelligence Act (AIA), which forbids remote biometric identification in a number of situations, on the grounds that such a ban would let criminals go free.

One MP, Liberal Democrat Bobby Dean, specifically called for police use of the technology to be halted until, at the very least, primary legislation governing its use is in place: “I think it’s obvious from this room today that there are a lot of questions, so we should probably consider stopping the use of this technology until those questions have been answered.”
