Surrey Police will continue to use AI-powered surveillance vans to scan thousands of people’s faces in public locations, despite fears over ethnic bias and calls from councillors for their use to be put on hold.
The Home Office is funding the use of new artificial intelligence-powered cameras in Surrey to scan the faces of anybody who crosses their path.
On November 26 last year, the force brought the technology to Woking and scanned 7,686 people over a five-hour period, cross-referencing them against known suspects.
The force has said the system is safe, citing a 2023 study that found previous bias in the system had been coded out, but more recent testing by the National Physical Laboratory suggests false positives still occur too frequently among ethnic minorities.
The report read: “At the operational setting used by police, the testing identified that in a limited set of circumstances the algorithm is more likely to incorrectly include some demographic groups in its search results.”
The Home Office has said it will act on the findings and that a “new algorithm has been procured and independently tested, which can be used at settings with no significant demographic variation in performance.
The new algorithm is due to be operationally tested early next year and will be subject to evaluation.”
This has led to calls from Woking borough councillors for the system to be mothballed until it has been thoroughly tested, something Surrey Police has so far refused to do.
Speaking at a meeting of the borough’s communities and housing scrutiny committee on Tuesday, December 20, Surrey Police Chief Inspector Andy Hill said the system had the support of the Home Office and described it as a valuable tool “to keep Surrey safe.”
He said: “It’s a safe place but if we’ve got the opportunity to use the latest technology then we want to make sure that we are doing that.”

Early versions of the software created false alerts at a disproportionate rate among ethnic groups.
In London, the Met Police is facing a High Court challenge after an anti-knife crime activist said he was misidentified and threatened with arrest. Surrey Police said it was confident in the system and that people are arrested only on suspicion, which does not imply guilt.
The technology is used in high-footfall areas and is said to have a deterrent effect on crime, with notable falls recorded in the weeks following its deployment.
Any images that do not match those on its wanted list are instantly deleted, and matched faces are deleted at the end of the day. If the system believes it has found a face on the police’s wanted database, officers at the scene are notified and it is up to them how to proceed.
Committee chair Cllr Tom Bonsundy-O’Bryan said: “I have very serious concerns about the proportionality of this. Are the pros, which feels pretty limited in one of the safest town centres in the UK, worth the cost of 7,000 free citizens having their faces scanned by this technology?
“This doesn’t feel like targeted policing, it doesn’t feel like proportionate policing. It starts to feel like something more Orwellian in a kind of mass surveillance. With everything that you’ve said, all the facts about data not being stored, data not being used to train models,
“It still feels like an overreach into people’s privacy, people’s rights fundamentally. Is there a point when it’s not proportionate, how many faces should we scan? To me it already feels vastly disproportionate.”
Chief Insp Hill said: “We are of the view that it is proportionate and it is appropriate and it is technology available to us. We don’t feel like we are reaching into a technology space. The van is funded by the Home Office, which is why we want to continue using it but also keep it under review.”