Human rights regulator objects to UK police’s use of live facial recognition

The UK’s human rights watchdog has criticized the Metropolitan Police’s use of live facial recognition technology (LFRT), arguing that the way it is being deployed breaches human rights law.

The Equality and Human Rights Commission (EHRC) stated that the Met’s current policy on the technology falls short of the legal standards of necessity and proportionality.

LFRT works by scanning faces in real time via CCTV and comparing them against a police watchlist of wanted individuals.

The Met maintains its use is lawful and has credited the technology with over 1,000 arrests since January 2024, including for serious offences such as rape and robbery. The force also plans to use the tech at major events, including the upcoming Notting Hill Carnival.

Despite these benefits, the EHRC is concerned that LFRT poses a significant threat to key human rights, including privacy, freedom of expression, and freedom of assembly. The regulator has been granted permission to intervene in an upcoming judicial review of the Met’s policy, arguing that without specific domestic legislation, the technology risks misuse.

Civil rights groups and privacy campaigners have consistently opposed LFRT, citing its invasive nature and a high risk of misidentification. Research has shown that facial recognition algorithms can be biased and are less accurate at identifying women and people of colour.

The Met, however, defends its use, asserting it’s an effective way to cut crime at a time of tight police budgets. The upcoming judicial review in January 2026 will be a key battleground in determining the future legal status of this controversial surveillance tool in the UK.
