Essex Police suspend use of facial recognition cameras after study finds racial bias

Essex Police have suspended the use of live facial recognition (LFR) technology after a study found the cameras were more likely to correctly identify black people on police watchlists than people of other ethnicities.


The decision to suspend the use of the AI-enabled systems was revealed by the Information Commissioner’s Office (ICO), which regulates the use of the technology. LFR has so far been deployed by at least 13 police forces, in London, South and North Wales, Leicestershire, Northamptonshire, Hampshire, Bedfordshire, Suffolk, Greater Manchester, West Yorkshire, Surrey and Sussex.

The ICO said Essex Police had paused LFR deployments “after identifying potential accuracy and bias risks” and urged other forces to implement mitigation measures. LFR systems are mounted in fixed locations or deployed from vans. In January, the home secretary, Shabana Mahmood, announced that the number of LFR vans would increase fivefold, with 50 to be made available to police forces across England and Wales.

Essex commissioned academics from the University of Cambridge to carry out a study in which 188 actors walked past cameras actively deployed from marked police vans in Chelmsford. The results, published last week, showed that about half of the people on a watchlist were correctly identified and that incorrect identifications were extremely rare, but that the system was more likely to correctly identify men than women and was “statistically significantly more likely to correctly identify black participants than participants from other ethnic groups”.

Live facial recognition vans are increasingly available to police forces in England and Wales. Photograph: Andrew Matthews/PA

This “raises questions about equity that require continued monitoring,” the report concludes. One of its authors, Dr Matt Bland, a criminologist, told The Guardian and Liberty Investigates: “If you are a criminal going through facial recognition cameras that are installed like they are in Essex, your chances of being identified as part of a police watch list are higher if you are black. To me, that warrants further investigation.”

The problem differs from the most common public concern about the technology, which is that it misidentifies innocent people. Last month it emerged that police had arrested a man for a robbery in a city 100 miles away that he had never visited, after retrospective facial recognition software confused him with another man of South Asian descent.

Possible reasons for the latest problem with LFR include overtraining of the algorithm on black faces. Experts believe this could be fixed by adjusting the system’s settings. A separate study of the same technology by the government’s National Physical Laboratory found black men were more likely to be matched correctly by the system and white men less likely, but the effect was not statistically significant.

The Home Office has said LFR cameras deployed in London between January 2024 and September 2025 led to more than 1,300 arrests of people wanted for crimes including rape, domestic abuse, robbery and grievous bodily harm. But opponents of facial recognition technology said the latest research showed that warnings about bias in LFR were being borne out.

“Police across the country should take note of this fiasco,” said Jake Hurfurt, head of investigations at Big Brother Watch. “AI surveillance that is experimental, untested, inaccurate or potentially biased has no place on our streets.”

Essex Police have been contacted for comment.
