The police have defended their use of AFR, but have not publicly commented on the case brought against them. The outcome of this trial could shape the future regulation and use of the technology.

Speaking to BBC News, Bridges said: “I popped out of the office to do a bit of Christmas shopping and on the main pedestrian shopping street in Cardiff, there was a police van.
“By the time I was close enough to see the words ‘automatic facial recognition’ on the van, I had already had my data captured by it. That struck me as quite a fundamental invasion of my privacy.”
Bridges had his image captured a second time while attending a peaceful protest against the arms trade. He argues that AFR breaches his human right to privacy, as well as data protection and equality laws.
The use of fingerprints and DNA by police is strictly regulated, but no equivalent safeguards exist for other forms of biometric data. Currently, there is little governance over how the data is gathered or managed by police and government.

Civil rights group Liberty, which is representing Bridges, has said the use of the tech is equivalent to the unregulated taking of DNA and fingerprints without consent. The group asserts that if the tech breaches human rights then it should not be used.
AFR tech is capable of scanning large numbers of people in public spaces such as shopping centres or football crowds without their knowledge. The captured data can then be compared to the images on the police’s ‘watch lists’ to see if they match.
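The comparison step described above can be sketched as a nearest-neighbour search over face embeddings. The toy vectors, names, and threshold below are purely illustrative assumptions for the sketch, not details of any actual police system:

```python
import math

# Hypothetical face embeddings for people on a watch list. A real AFR
# system would produce high-dimensional vectors from a trained model;
# these short vectors are stand-ins for illustration only.
WATCH_LIST = {
    "suspect_1": [0.9, 0.1, 0.3, 0.4],
    "suspect_2": [0.1, 0.8, 0.5, 0.2],
}

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def match_against_watch_list(face_embedding, watch_list, threshold=0.95):
    """Return the best watch-list match above the threshold, else None.

    The threshold is an assumed tuning parameter: lowering it catches
    more true matches but raises the false-positive rate — the trade-off
    at the centre of the concerns reported in this article.
    """
    best_name, best_score = None, threshold
    for name, reference in watch_list.items():
        score = cosine_similarity(face_embedding, reference)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# A captured face very close to suspect_1's reference embedding matches;
# an unrelated face falls below the threshold and returns None.
print(match_against_watch_list([0.88, 0.12, 0.31, 0.41], WATCH_LIST))
print(match_against_watch_list([0.2, 0.2, 0.9, 0.1], WATCH_LIST))
```

The sketch also shows why the threshold matters: every face scanned in a crowd is scored against every watch-list entry, so even a small per-comparison error rate can produce many false positives at scale.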
While it is acknowledged that the tech can help prevent serious crimes, such as acts of terrorism, Liberty says that police are also deploying it for petty offences such as pickpocketing.
Liberty has raised a number of concerns, including where the watch list images are coming from and the fact that the police have not ruled out scraping social media platforms for them.
Moreover, other civil liberties groups say that studies have shown AFR has a high rate of misidentification and discriminates against ethnic minorities, in particular women of colour.
Megan Goulding, a lawyer for Liberty, said: “If you are a woman or from an ethnic minority and you walk past the camera, you are more likely to be identified as someone on a watch list, even if you are not. This means you are more likely to be stopped and interrogated by the police.
“This is another tool by which social bias will be entrenched and communities who are already over-policed simply get over-policed further.”

The group says the frequency of false positives has the potential to change the nature of public spaces.
Last week, amid rising concerns over the technology’s reliability and its infringement of people’s liberty and privacy, San Francisco became the first city in the US to ban the use of AFR. The Home Office, information commissioner and surveillance camera commissioner have all become involved in the case, reflecting strong interest in defining the legal parameters for AFR’s use.