<https://www.theguardian.com/technology/2026/feb/25/facial-recognition-error-prompts-police-to-arrest-asian-man-for-burglary-100-miles-away>
"Police arrested a man for a burglary in a city he had never visited after face
scanning software deployed across the UK confused him with another person of
south Asian heritage.
Alvi Choudhury, 26, a software engineer, was working at the home he shares with
his parents in Southampton in January when police knocked on his door,
handcuffed him and held him in custody for nearly 10 hours before releasing him
at 2am.
Thames Valley police had used automated facial recognition software which
matched him with footage of a suspect in a £3,000 burglary 100 miles away in
Milton Keynes, according to documents shared with the Guardian by Liberty
Investigates.
But the CCTV footage showed a noticeably younger man with different features
apart from similar curly hair, said Choudhury, who was left confused about why
he had been arrested.
“I was very angry, because the kid looked about 10 years younger than me,” said
Choudhury, who wears a beard. “Everything was different. Skin was lighter.
Suspect looked 18 years old. His nose was bigger. He had no facial hair. His
eyes were different. His lips were smaller than mine.
“I just assumed that the investigative officer saw that I was a brown person
with curly hair and decided to arrest me.”
UK police forces use an algorithm procured by the Home Office from Cognitec, a
German company. It runs about 25,000 monthly searches against around 19m police
mugshots held on the UK-wide police national database. Facial matches should be
treated as intelligence, not fact, according to the National Police Chiefs’
Council. Thames Valley police said the decision to arrest Choudhury was made
after a human visual assessment as well.
But the technology was revealed in December to produce a far higher rate of
false positives for black (5.5%) and Asian (4.0%) faces than for white faces
(0.04%) at certain settings, according to Home Office commissioned research.
Police and crime commissioners warned of “concerning in-built bias”, and said
that while “there is no evidence of adverse impact in any individual case, that
is more by luck than design”."
Via Susan ****
Cheers,
*** Xanni ***
--
mailto:xanni@xanadu.net Andrew Pam
http://xanadu.com.au/ Chief Scientist, Xanadu
https://glasswings.com.au/ Partner, Glass Wings
https://sericyb.com.au/ Manager, Serious Cybernetics