Police facial recognition trials failing

Lester Mason
May 15, 2018

In figures given to Big Brother Watch, South Wales Police said its technology had made 2,685 "matches" between May 2017 and March 2018 - but 2,451 were false alarms.

The UK's Information Commissioner has threatened to take legal action over the use of facial recognition in law enforcement if the police and government cannot prove the technology is being deployed legally.

Police forces have used parts of the national database of custody images to deploy facial recognition to police crowds at major events.

Big Brother Watch's report focused on this live use of the technology, which it said breaches human rights law because it surveils people without their knowledge and may dissuade them from attending public events.

Big Brother Watch found that the system, used by the Met at the 2017 Notting Hill Carnival, was wrong 98% of the time.

Wired revealed earlier this month that during the UEFA Champions League Final last June, there were 2,470 alerts of possible matches, of which 2,297 were false positives and 173 were accurate identifications. Because of the poor quality of the images used, the system was identifying people wrongly.

"All alerts against the watch list are deleted after 30 days".

Two forces, South Wales Police and the Metropolitan Police, have been testing facial recognition cameras at public events in an effort to catch wanted criminals.

A member of staff from the human rights organisation Liberty who observed the Met Police's operation at Notting Hill Carnival last year claimed the technology led to at least 35 false positives, five people being unduly stopped and one wrongful arrest.


The group described this as a "chilling example of function creep" and an illustration of the risk the technology could pose to the rights of marginalised people.

"But how facial recognition technology is used in public spaces can be particularly intrusive".

The software used by South Wales Police and the Metropolitan Police has not been tested for demographic accuracy, but in the United States concerns have been raised that facial recognition is less reliable for women and black people.

South Wales Police added that a "number of safeguards" stopped police taking action against innocent people. "Firstly, the operator in the van is able to see that the person identified in the picture is clearly not the same person, and it's literally disregarded at that point", Lewis said. The Information Commissioner also welcomed the recent appointment of a National Police Chiefs' Council (NPCC) lead for the governance of the use of facial recognition technology in public spaces.

"Officers can quickly establish if the person has been correctly or incorrectly matched by traditional policing methods, either by looking at the person or through a brief conversation", a spokesperson said.

"On a much smaller number of occasions, officers went and spoke to the individual. realised it wasn't them, and offered them the opportunity to come and see the van".

"Automated facial recognition technology is now used by United Kingdom police forces without a clear legal basis, oversight or governmental strategy", the group said.

Big Brother Watch said the forces' systems had wrongly flagged thousands of innocent people. Images of innocent people, meanwhile, remain on the police database unless the individual asks for them to be removed.

In March, the Home Office minister Baroness Williams said that because images can only be deleted manually, weeding out innocent people "will have significant costs and be hard to justify given the off-setting reductions forces would be required to find to fund it".
