
Police must gain public trust on facial recognition tech

16 May 2018

One particular development in policing is the use of biometric data, including databases of facial images, in conjunction with automatic facial recognition technology.

Some in policing say facial recognition is the next big leap in law enforcement, akin to the revolution brought about by advances in DNA analysis.

Biometric data, including databases of facial images, in conjunction with automatic facial recognition technology (FRT) has been available for some time, "but the ability of the technology to be linked to different online databases, with mobile and fixed camera systems, in real time, greatly increases its reach and impact," said Information Commissioner Elizabeth Denham.

Potential matches are flagged, allowing police to investigate further. But on 31 occasions police followed up the system's alerts, only to find they had in fact stopped innocent people and the identifications were false.

South Wales Police has admitted it keeps images of innocent people wrongly identified by its facial recognition cameras for a year. That means every innocent person wrongly identified at these events - more than 2,400 people in South Wales Police's case - has their image on a police database, and these people are completely unaware of it.

The system hasn't had much success with positive identifications either: the report showed just two accurate matches, and neither person was a criminal.

If the prospect of this dystopian and authoritarian policing tool turning us all into walking ID cards weren't enough on its own, there are huge problems with both the technology and the police's intrusive and oppressive use of it. The force maintains that "all alerts against the watch list are deleted after 30 days".

The HD cameras detect all the faces in a crowd and compare them with existing police photographs, including mug shots. The system's false positive rate is 91 per cent, and the matches have led to just 15 arrests - equivalent to 0.005 per cent of matches.
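The arithmetic behind the headline figure can be sketched briefly. The article reports more than 2,400 wrongly identified people and a 91 per cent false positive rate; the total alert count below is an assumption, chosen only so that the two reported figures are consistent with each other:

```python
# Rough illustration of the reported false positive rate.
# false_positives comes from the article's "over 2,400" wrongly
# identified people; total_matches is an assumed figure picked so
# the rate lines up with the reported 91 per cent.
false_positives = 2400
total_matches = 2637  # assumption, not an official figure

fp_rate = false_positives / total_matches
print(f"False positive rate: {fp_rate:.0%}")  # → 91%
```

On these numbers only a few hundred alerts could have been genuine matches, which is why campaigners argue the 15 resulting arrests are such a small return.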

South Wales Police defended its use of the facial recognition software, insisting that the system has improved over time. "When we first deployed, and we were learning how to use it, some of the digital images we used weren't of sufficient quality," Deputy Chief Constable Richard Lewis told the BBC.

"If an incorrect match has been made, officers will explain to the individual what has happened, invite them to see the equipment and provide them with a Fair Processing Notice".

Typically, people on this list have mental health issues, and Big Brother Watch expressed concern that, by the police's own account, there had been no prior consultation with mental health professionals about cross-matching against people in this database.

Underlying the concerns about the poor accuracy of the kit are complaints about a lack of clear oversight - an issue that has been raised by a number of activists, politicians and independent commissioners in related areas.

New data protection rules are about to come into force in the United Kingdom, requiring organizations to assess the risks of new technologies, particularly when biometric data is involved, and also to provide a data protection impact assessment to Denham's office in some circumstances.

The Home Office said that it plans to publish its biometrics strategy in June, and it "continues to support police to respond to changing criminal activity and new demands". The force said the images were only stored as part of an academic evaluation for UCL, and not for any policing goal.

Big Brother Watch is taking the report to Parliament today to launch a campaign calling for police to stop using the controversial technology, branded by the group as "dangerous and inaccurate".
