MPs have called upon police forces to suspend facial recognition trials amid concerns over privacy and the potential for bias.

In a report published this week, the House of Commons Science and Technology Committee voiced serious concerns over the accuracy of the invasive technology and raised questions over bias, an issue which has come up repeatedly in discussions of the technology's use. The report's publication comes just one week after Home Secretary Sajid Javid announced he would back police trials of facial recognition technology. Javid did, however, concede that in the long term the technology's use would require strict legislation.
Recommendations from the committee include a halt on trials of facial recognition systems until it is established how they can be used appropriately and without risk to the public. In addition, the report suggests that stronger regulatory frameworks must be implemented before the technology can be safely deployed in public spaces throughout the UK.
“Automatic facial recognition should not be deployed until concerns over the technology’s effectiveness and potential bias have been fully resolved,” the report said.
Police forces are failing in their duty to review custody image databases and remove pictures of unconvicted persons, the committee found. As such, MPs have warned that innocent people may be included in the facial recognition watchlists which police draw upon to stop people in public spaces.
“It is unclear whether police forces are unaware of the requirement to review custody images every six years, or if they are simply ‘struggling to comply’,” the report noted.
“What is clear, however, is that they have not been afforded and earmarked resources to assist with the manual review and weeding process.”

Concerns over the use of custody images were raised last year, the committee insisted, yet there has been little progress in this regard from the Home Office. The report also cited the Biometrics and Forensics Ethics Group, which claimed that facial recognition systems could produce “inaccurate results” due to a lack of diversity within the datasets that police forces and other authorities use to train algorithms.
In February this year, the advisory group said: “If certain types of faces – for example, black, Asian and minority ethnic faces or female faces – are underrepresented in live facial recognition training datasets, then this bias will feed forward into the use of the technology by human operators.”
Police officers could begin to “defer to the algorithm’s decision” without following double-check procedures, the group added. Those procedures are in place to ensure matches are confirmed before officers take any action.