
UK ICO Demands Clarity on Racial Bias in Facial Recognition Technology

First reported
Last updated
1 unique source · 1 article

Summary


The UK's Information Commissioner's Office (ICO) has demanded urgent clarity from the Home Office over racial bias in the retrospective facial recognition (RFR) technology used by police. A report by the National Physical Laboratory (NPL) found significantly higher false positive rates for Asian and black subjects than for white subjects. The ICO expressed disappointment that it was not previously informed of these biases, despite its regular engagement with the Home Office. The Home Office has purchased a new algorithm to address the issue, which is set for operational testing early next year.

Timeline

  1. 08.12.2025 12:30 · 1 article · 23h ago

    UK ICO Demands Clarity on Racial Bias in Facial Recognition Technology



Information Snippets