The House of Lords has raised significant concerns about the use of live facial recognition (LFR) technology by retailers and is calling for new legislation to ensure its ethical and safe application by private companies. This follows an inquiry by the Lords’ Justice and Home Affairs Committee (JHAC), which began in May 2024 with a focus on combating shoplifting.
The inquiry also explored the use of both live and retrospective facial recognition (RFR) technologies by retailers and law enforcement in addressing retail crime.
The committee’s concerns primarily relate to the potential misuse of facial recognition technology by private retailers. One major issue is that retailers often collaborate to build local databases or watchlists of suspected shoplifters without clear criteria or criminal thresholds for inclusion.
This means individuals can be added to these lists without their knowledge, potentially leading to bans from multiple stores or even entire regions. The Lords expressed concern about the lack of transparency in these decisions and the absence of recourse for those wrongfully included in the databases, particularly in cases of misidentification.
In addition to concerns about privacy and fairness, the committee noted risks associated with the technology, including possible violations of the General Data Protection Regulation (GDPR) and the potential for discrimination.
Studies have shown that facial recognition systems can be less accurate for people with darker skin tones, raising concerns about bias in algorithmic decisions. These issues have led to calls for clear legal frameworks to govern the use of such technologies, ensuring that individuals’ rights and freedoms are protected.
The committee also reviewed the use of RFR by law enforcement, including its role in helping retailers report crimes. While retailers like the Co-op Group compile evidence packs to support police investigations, including CCTV and body camera footage, it is not yet standard practice for police to cross-check this footage with national databases.
In 2023, the UK government launched Project Pegasus, which involves major retailers sharing footage with the police to identify organized retail crime suspects. The committee welcomed this initiative and recommended continued funding for the project.
The JHAC’s inquiry aligns with broader concerns regarding the use of algorithmic technologies in policing. In previous investigations, the committee found that police are deploying facial recognition and other algorithmic tools, such as predictive policing systems, without sufficient scrutiny or understanding of their effectiveness.
The lack of transparency and oversight has led to the characterization of the current situation as a “Wild West” of unchecked technology use. The committee believes that stronger legal safeguards are necessary to prevent violations of privacy rights and ensure accountability.
Despite these concerns, the UK government has defended its use of facial recognition technology, asserting that it helps tackle rising shoplifting rates and improve public safety. A Home Office spokesperson pointed to recent changes, such as scrapping the £200 threshold under which shop theft was treated as a low-value offense and making assaulting a shop worker a standalone criminal offense, as evidence of the government’s commitment to addressing retail crime.
However, both Parliament and civil society have called for new regulations to better govern the use of biometrics, and the JHAC has reiterated its belief that a balanced approach to regulation is necessary to prevent misuse while enabling innovation in crime prevention.