Essex Police Halt Facial Recognition Use After Study Reveals Racial Bias

Essex police have paused live facial recognition use after a University of Cambridge study found the technology was significantly more likely to correctly identify black individuals than people from other ethnic groups, raising concerns about fairness and bias in AI surveillance systems.

An Essex Police officer beside a live facial recognition van in Chelmsford

Essex Police Suspend Live Facial Recognition Over Bias Concerns

Essex police have temporarily halted the use of live facial recognition (LFR) technology following a study which revealed that the cameras were significantly more likely to correctly identify black individuals than people from other ethnic groups.

The suspension of the AI-powered systems was disclosed by the Information Commissioner’s Office (ICO), the data protection regulator, which oversees how the technology is used by at least 13 police forces across London, south and north Wales, Leicestershire, Northamptonshire, Hampshire, Bedfordshire, Suffolk, Greater Manchester, West Yorkshire, Surrey, and Sussex.

The ICO stated that Essex police paused LFR operations "after identifying potential accuracy and bias risks" and cautioned other forces to implement appropriate mitigations. LFR systems are either installed at fixed locations or operated from vans. In January, the home secretary, Shabana Mahmood, announced plans to increase the number of LFR vans fivefold, making 50 vans available to every police force in England and Wales.

Live facial recognition vans are being made available more widely to police forces across England and Wales. Photograph: Andrew Matthews/PA

University of Cambridge Study Highlights Bias in Facial Recognition

Essex police commissioned academics from the University of Cambridge to conduct an independent evaluation involving 188 actors who walked past cameras actively deployed from marked police vans in Chelmsford. The study’s results, published last week, indicated that approximately half of the individuals on a watchlist were correctly identified, with false identifications being extremely rare.

However, the system demonstrated higher accuracy in identifying men than women and was "statistically significantly more likely to correctly identify black participants than participants from other ethnic groups."


The report concluded that this finding "raises questions about fairness that require continued monitoring." Dr Matt Bland, a criminologist and one of the study’s authors, told the Guardian and Liberty Investigates:

"If you’re an offender passing facial recognition cameras which are set up as they have been in Essex, the chances of being identified as being on a police watchlist are greater if you’re black. To me, that warrants further investigation."

Concerns Differ from False Identification of Innocent Individuals

This issue differs from the more common public concern about facial recognition technology, which centers on the misidentification of innocent people. Last month, it was revealed that police had mistakenly arrested a man 100 miles away after retrospective face-scanning software confused him with another individual of South Asian heritage.

Potential Causes and Further Research

Experts suggest the bias may stem from the algorithm having been overtrained on the faces of black individuals, and believe that adjusting the system’s settings could rectify the issue. A separate study of the same technology by the government’s National Physical Laboratory found that black men were the most likely to be correctly matched by the system and white men the least likely, although that effect was not statistically significant.

Home Office Reports Arrests Amidst Ongoing Debate

The Home Office has reported that LFR cameras deployed in London from January 2024 to September 2025 contributed to over 1,300 arrests of individuals wanted for serious crimes including rape, domestic abuse, burglary, and grievous bodily harm. Nonetheless, critics of facial recognition technology argue that the recent research confirms longstanding warnings about bias in LFR systems.

Jake Hurfurt, head of research and investigations at Big Brother Watch, stated:

"Police across the country must take note of this fiasco. AI surveillance that is experimental, untested, inaccurate or potentially biased has no place on our streets."

Essex Police Respond to Findings and Plan to Resume Use

Essex police commented on their decision to pause LFR deployments:

"Based on the fact there was potential bias the force decided to pause deployments while we worked with the algorithm software provider to review the results and seek to update the software. We then sought further academic assessment.
As a result of this work we have revised our policies and procedures and are now confident that we can start deploying this important technology as part of policing operations to trace and arrest wanted criminals. We will continue to monitor all results to ensure there is no risk of bias against any one section of the community."

This article was sourced from the Guardian.
