Civil society groups, including the Runnymede Trust and Amnesty International, have raised concerns about the role of automated policing technologies in perpetuating racial disparities in the UK. In a joint submission to the United Nations Committee on the Elimination of Racial Discrimination, they highlighted the detrimental effects of AI and facial-recognition technologies on people of color, arguing that these systems are contributing to a rollback of their civil and political rights.
The report points to systemic issues in the criminal justice system, health, education, employment, and immigration, with little progress being made to address racial inequalities.
The report also emphasized how these technologies, such as live facial recognition (LFR) and automatic number plate recognition (ANPR), often misidentify people of color, leading to human rights violations.
Despite these inaccuracies, the UK government has allowed the continued use of LFR, which has expanded significantly in recent years. The case of Chris Kaba, a 24-year-old Black man shot dead by police after the car he was driving was flagged by an ANPR camera, illustrates the fatal consequences that can follow from such automated policing systems. The report stresses the need for transparency and oversight in their deployment.
Further, the report highlights the discriminatory impact of the Met’s Gangs Matrix database, which racially profiled young Black men based on their social networks and interests.
Though this database was eventually condemned and scrapped, concerns remain over its potential replacements. Civil society groups are calling for a comprehensive review of gang databases, assessing their compliance with human rights laws and examining their effectiveness and fairness.
In response to these issues, the civil society groups recommended prohibiting the use of predictive and profiling systems in law enforcement. They also urged the UK government to ensure transparency in AI deployments by police and migration authorities and to implement legal safeguards to protect human rights.
Additionally, they called for an inquiry into police gang databases to determine if broader reforms are necessary to address racial biases in policing.
The UK government has said it is considering these concerns, with officials affirming their commitment to tackling racism, particularly in light of recent incidents of violence and discrimination.
However, civil society organizations remain critical, with some urging the government to ban AI-powered predictive policing and biometric surveillance due to their disproportionate targeting of marginalized communities. They argue that without strong regulation, these technologies will continue to exacerbate existing inequalities.
Ongoing concerns about the use of biometric and facial-recognition technologies in policing were echoed by the outgoing biometrics and surveillance camera commissioner, Fraser Sampson.
He questioned the effectiveness of these technologies and warned about the potential for the UK to become a surveillance state without proper oversight. Sampson also raised alarms about police forces retaining biometric data unlawfully, an issue that has persisted despite legal rulings calling for the deletion of such data.