James Clayton BBC Newsnight
Sarah wanted some chocolate one day, so she popped into Home Bargains.
“Within a minute a store employee approached me and said, 'You're a thief. Please leave the store.'”
Sarah, who wishes to remain anonymous, was falsely accused after being flagged by a facial recognition system called Facewatch.
She said she was escorted out of the store after her bags were searched and has been banned from all stores that use the technology.
“I cried the whole way home…I thought, 'Oh my God, will my life ever be the same? Will I be seen as a shoplifter when I've never stolen anything?'”
Facewatch later wrote to Sarah, acknowledging a mistake.
Facewatch is being used to identify shoplifters in a number of UK stores, including Budgens, Sports Direct and Costcutter.
The company declined to comment to the BBC about Sarah's story but said its technology helps prevent crime and protect frontline workers. Home Bargains also declined to comment.
Retailers aren't the only ones turning to this technology.
On a hot and humid day in Bethnal Green, east London, we accompanied the police as they parked their modified white van on a high street.
A camera mounted on the roof captured images of thousands of people's faces.
If they match someone on a police watch list, officers will speak to them and potentially make an arrest.
Critics unflatteringly liken the process to a supermarket checkout, with your face as the barcode.
Image caption: Police facial recognition van cameras can capture thousands of images
On the day we were filming, the Metropolitan Police announced they had made six arrests with the help of the technology.
They included two people who breached the conditions of a Sexual Harm Prevention Order, a man wanted on suspicion of grievous bodily harm and someone wanted on suspicion of assaulting a police officer.
Lindsay Chiswick, the Metropolitan Police's head of intelligence, told the BBC the speed of the technology had been extremely helpful.
“It takes less than a second for the technology to create a biometric image of a person's face, match it against a bespoke watch list and automatically remove it if there's no match.”
The BBC spoke to several people approached by police who confirmed they had been correctly identified by the system, which has resulted in 192 arrests so far this year.
But human rights groups are concerned its accuracy has yet to be fully established, pointing to cases like that of Sean Thompson.
Thompson, who works for youth support group Street Fathers, said he didn't think much of it when he walked past a white van near London Bridge in February.
However, within seconds police approached him and told him he was a wanted criminal.
“Then I felt a tap on my shoulder and was told I was wanted.”
Image caption: Sean Thompson says he was the victim of a misidentification
He was detained for 20 minutes and asked to provide his fingerprints, and was released only after handing over a copy of his passport.
However, that was a misidentification.
“It felt like an intrusion…I was treated as guilty until proven innocent,” he said.
The BBC understands the mistake may have been due to a family resemblance. The Metropolitan Police declined to comment.
“Digital lineup”
Big Brother Watch director Silkie Carlo has filmed police deployments of facial recognition systems on numerous occasions, and was there the night Sean Thompson was stopped by police.
“My experience from observing live facial recognition over the years is that most of the general public don't actually know what it is,” she says.
A person's face, once scanned, effectively becomes part of a digital police mugshot list, she said.
“If a match alert is triggered, police may arrive, detain them, question them and ask them to prove their innocence.”
Police use of facial recognition technology is expanding.
Between 2020 and 2022, the Metropolitan Police used live facial recognition nine times. The following year, that number rose to 23 times.
In 2024 it has already been used 67 times, and the direction of travel is clear.
Supporters say misidentifications are rare.
According to the Metropolitan Police, around one in every 33,000 people who walk in front of the cameras is misidentified.
But measured against alerts alone, the error rate is far higher: so far this year, one in 40 alerts has been a false positive.
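The two figures are not contradictory; they simply use different denominators — everyone scanned versus only those who trigger an alert. A back-of-the-envelope sketch (the crowd size of 330,000 is an assumed number for illustration, not from the police figures) shows how both rates can hold at once:

```python
# Illustrative only: reconciling "1 in 33,000 scanned" with "1 in 40 alerts".
people_scanned = 330_000                      # assumed crowd size
false_positives = people_scanned // 33_000    # 1 per 33,000 scanned -> 10
alerts = false_positives * 40                 # if 1 in 40 alerts is false -> 400 alerts
scans_per_alert = people_scanned // alerts    # implies ~1 alert per 825 people

print(false_positives, alerts, scans_per_alert)  # 10 400 825
```

In other words, because only a tiny fraction of passers-by trigger an alert at all, even a small per-person error rate translates into a much larger share of the alerts being wrong.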
Michael Birtwistle, research director at the Ada Lovelace Institute, believes the technology is so new that the law has not yet caught up.
“It's really a lawless area at the moment, which creates legal uncertainty as to whether its current use is even lawful,” he said.
In Bethnal Green, some people the BBC spoke to were worried about the use of the technology, but the majority were supportive if it could help fight crime.
This raises another question about this technology: Will it help in the long run?
As white vans parked on busy thoroughfares become a familiar sight, will those who know they are wanted by police spot the cameras and simply avoid them? Will shoplifters hide their faces?
Carlo says society needs to be vigilant to prevent facial recognition from becoming the norm.
“Once police are able to say this is OK, this is something they can do routinely, why not incorporate it into a fixed camera network?”
This is the dystopian future civil rights activists fear most: a Chinese-style mass surveillance state.
Advocates reject such dire predictions as exaggerated.
And it's clear that many people would be happy to put up with having their faces scanned if it would make their cities safer.