Police’s facial recognition technology accurate only three quarters of the time

A police force’s use of facial recognition technology requires “considerable investment” to deliver consistent results, a study has concluded.

Crashing computer systems and poor quality images are among the challenges South Wales Police officers have faced since rolling out the technology.

The force first deployed automated facial recognition (AFR) at the 2017 Champions League final and has since used it at autumn rugby internationals, an Anthony Joshua boxing match and music events.

Researchers at the Crime and Security Research Institute at Cardiff University found that large crowds, low light and people wearing glasses and hats hindered the technology’s effectiveness.

South Wales Police said the report provided a “balanced perspective” and that it remained committed to using AFR in a “proportionate and lawful way to protect the public”.

Developed by Japanese company NEC, the technology works in two modes: “Locate” scans faces in real-time from CCTV video feeds and searches for matches against a pre-selected watch-list; “Identify” takes still images from CCTV or mobile phones and compares them against the police custody database.

The use of facial recognition technology, which has also been trialled by the Metropolitan Police, has sparked debate around privacy and accuracy.

At the 2017 Champions League final in Cardiff, the technology wrongly matched more than 2,000 people to possible criminals.

The Champions League Final at the National Stadium, Cardiff, in 2017 (Nick Potts/PA)

In June this year Cardiff resident Ed Bridges launched a legal challenge seeking to force South Wales Police to drop the system, after he said he was scanned at a protest event.

During the June 2017 to March 2018 evaluation period, Cardiff University researchers found the system improved as algorithms were changed.

The rate of “false positive” matches in Locate mode dropped from 72% to 50%, while Identify’s rate of incorrect matches fell from 16% to 9%.

A field trial of the technology revealed it has the potential to identify a person of interest around 76% of the time.

But researchers also found that in 68% of submissions made by police officers in the Identify mode, the image was not of sufficient quality for the system to work.

A total of 18 arrests were made during the evaluation, and more than 100 people were charged following investigative searches.

The technology can scan faces in moving or still CCTV images and match them against police watch-lists (PA)

Overall, the study found that AFR can help police identify suspects “where they would probably not otherwise have been able to do so”, but “considerable investment and changes to police operating procedures” were needed to achieve consistent results.

Academics emphasised the system should not be seen as a fully automated tool, but as a technology to assist human operators.

They concluded that lighting, weather and crowd flows all affected the technology’s performance.

Professor Martin Innes, who led the evaluation, said: “There is increasing public and political awareness of the pressures that the police are under to try and prevent and solve crime.

“Technologies such as AFR are being proposed as having an important role to play in these efforts.

“What we have tried to do with this research is provide an evidence-based and balanced account of the benefits, costs and challenges associated with integrating AFR into day-to-day policing.”

Deputy Chief Constable Richard Lewis said South Wales Police had “learned much” during the evaluation period.

“The report provides a balanced perspective of our use of the technology and hopefully it will help to demystify some of the misunderstandings and misinformation that have proliferated across the press,” he added.

“South Wales Police remains committed to the continuous use of the technology in a proportionate and lawful way to protect the public, whilst also remaining open and transparent about how and when we use it.”

Chris Price