UK ICO Demands Clarity on Racial Bias in Facial Recognition Technology
Summary
The UK's Information Commissioner's Office (ICO) has demanded urgent clarity from the Home Office regarding racial bias in the retrospective facial recognition (RFR) technology used by police. A report by the National Physical Laboratory (NPL) revealed significantly higher false positive rates for Asian and black subjects than for white subjects. The ICO expressed disappointment that it was not previously informed about these biases, despite regular engagement with the Home Office. The Home Office has purchased a new algorithm to address the issue, which is set for operational testing early next year.
Timeline
-
08.12.2025 12:30 · 1 article
UK ICO Demands Clarity on Racial Bias in Facial Recognition Technology
- UK ICO Demands “Urgent Clarity” on Facial Recognition Bias Claims — www.infosecurity-magazine.com — 08.12.2025 12:30
Information Snippets
-
The NPL report tested the Cognitec FaceVACS-DBScan ID v5.5 algorithm used in RFR technology.
First reported: 08.12.2025 12:30 · 1 source, 1 article
- UK ICO Demands “Urgent Clarity” on Facial Recognition Bias Claims — www.infosecurity-magazine.com — 08.12.2025 12:30
-
The false positive rate for white subjects was 0.04%, compared with 4% for Asian subjects and 5.5% for black subjects.
First reported: 08.12.2025 12:30 · 1 source, 1 article
- UK ICO Demands “Urgent Clarity” on Facial Recognition Bias Claims — www.infosecurity-magazine.com — 08.12.2025 12:30
-
The false positive identification rate (FPIR) for black male subjects was 0.4%, while for black female subjects it was 9.9%.
First reported: 08.12.2025 12:30 · 1 source, 1 article
- UK ICO Demands “Urgent Clarity” on Facial Recognition Bias Claims — www.infosecurity-magazine.com — 08.12.2025 12:30
-
The Home Office has purchased a new algorithm that will be operationally tested early next year.
First reported: 08.12.2025 12:30 · 1 source, 1 article
- UK ICO Demands “Urgent Clarity” on Facial Recognition Bias Claims — www.infosecurity-magazine.com — 08.12.2025 12:30
-
The Association of Police and Crime Commissioners (APCC) expressed concerns about transparency and called for robust and independent assessment of the technology.
First reported: 08.12.2025 12:30 · 1 source, 1 article
- UK ICO Demands “Urgent Clarity” on Facial Recognition Bias Claims — www.infosecurity-magazine.com — 08.12.2025 12:30