An innocent man is suing the police after facial recognition software mistook him for a suspected burglar in a town 100 miles away.
Alvi Choudhury, 26, was working from home at the house he shares with his parents in Southampton when officers arrested him. He was held in custody for 10 hours before being freed with no further action at 2am.
Automated facial recognition software, used by Thames Valley Police, had matched him to a £3,000 burglary suspect in Milton Keynes.
But Mr Choudhury claims CCTV footage of the alleged crime featured a younger man who looked nothing like him apart from his curly hair.
He is now seeking damages from Thames Valley Police and from Hampshire Constabulary, which took him into custody. He says his neighbours witnessed his arrest in January, which left his father very anxious and him unable to work the following day.
Matches made using facial recognition technology require the suspect’s mugshot to be stored on the police system.
Mr Choudhury’s face was on the database after he was wrongly arrested in 2021 having been attacked during a night out in Portsmouth.
UK police forces use a German algorithm procured by the Home Office to trawl through around 19 million mugshots on the national database.

They run around 25,000 searches a month and, according to the National Police Chiefs’ Council, the matches should be treated as intelligence and not fact.
Mr Choudhury said the suspect his face was matched with looked around 10 years younger than him, with lighter skin, a larger nose, smaller lips and no facial hair.
Thames Valley Police insisted the decision to arrest Mr Choudhury was made following a human visual assessment as well as the technology match.
But the force admitted the mistake ‘may have been the result of bias within facial recognition technology’.
The facial recognition software is far from fool-proof and Home Office research revealed in December that matches for black faces are false positives 5.5 per cent of the time, while Asian face matches end up being false positives in 4 per cent of cases.
Both figures are far higher than the 0.04 per cent of white face matches which result in false positives.
An officer told Mr Choudhury: ‘As the use of facial recognition is already subject to review at a strategic level, I do not feel the need to raise this issue as part of wider organisational learning.’
The innocent man said he was taken into custody despite the differences between his face and the one in the clip.
He added that he also presented evidence of work meetings in Southampton on the day of the alleged crime 115 miles away.
When he quizzed officers at Hampshire police station about whether he looked at all like the man in the video, they erupted into laughter, he claimed.
Mr Choudhury added that once Thames Valley cops turned up to interview him they realised that the suspect in the clip was not him.

He is now concerned that having a second mugshot on the system following the incident could lead to more wrongful arrests.
Mr Choudhury added that the saga makes him look ‘dodgier’, which could harm him professionally: as a software engineer, he sometimes needs security clearance to work for government clients.
Police and crime commissioners have previously warned of ‘concerning in-built bias’ and insisted that while ‘there is no evidence of adverse impact in any individual case, that is more by luck than design’.
A Thames Valley Police spokesperson said: ‘While we apologise for the distress caused to the complainant in this case, their arrest was based on the investigating officers’ own visual assessment that the individual matched the suspect in CCTV footage following a retrospective facial recognition match, and was not influenced by racial profiling.
‘To confirm, retrospective facial recognition technology did initially provide intelligence, but did not determine the arrest.
‘Although later enquiries eliminated the individual from the investigation, this does not make the arrest unlawful.
‘We continue to use policing tools responsibly while striving to improve and build trust in our communities.’
Mr Choudhury’s lawyer, Iain Gould of DPP Law, said: ‘This isn’t policing by consent, nor is it policing by common sense.
‘In this case, the police have been playing AI lottery with people’s lives, and Alvi has been wrongly arrested; now the police must pay the price for that.’
Last month South Wales Police paid damages to a black man who was identified as a possible match to a stalking suspect despite being 32nd on the facial recognition technology’s list of suggested matches.
It came after an innocent grandfather was wrongly accused of being a thief after facial recognition technology suggested he had stolen items.
Ian Clayton, 67, said he was told to leave a Chester Home Bargains shop after the technology claimed he was involved in a theft he said had nothing to do with him.
After being asked to leave, Mr Clayton contacted security company Facewatch, which sent him his photo with a message saying he had put items into a bag and stolen them.
The facial recognition technology flags suspicious movements, such as goods being stuffed into bags, and sends staff a message with footage and the location of the behaviour.
It also sends an alert to staff if a so-called subject of interest on a watchlist enters a shop.
But Facewatch admitted Mr Clayton should not have appeared on its system, saying it had permanently removed his image and ‘the associated record’.
Hampshire Constabulary declined to comment.


