LONDON: On a grey, cloudy morning in December, London police deployed a state-of-the-art AI-powered camera near the railway station in the suburb of Croydon and quietly scanned the faces of unsuspecting passersby.
The use of live facial recognition (LFR) technology – which creates biometric facial signatures before instantaneously running them through a watchlist of suspects – led to 10 arrests for crimes including threats to kill, bank fraud, theft and possession of a crossbow.
The technology, which was used at the British Grand Prix in July and at King Charles III’s coronation in May, has proved so effective in trials that the UK government wants it used more. “Developing facial recognition as a crime fighting tool is a high priority,” policing minister Chris Philp said in October, adding that the technology has “great potential”.
But the call to expedite its roll-out has outraged some parliamentarians, who want the government’s privacy regulator to take “assertive, regulatory action” to prevent its abuse. “Facial recognition surveillance involves the processing, en masse, of the sensitive biometric data of huge numbers of people – often without their knowledge,” they wrote in a letter.
MPs said the use of the technology by private companies, meanwhile, represented a “radical transfer of power” from ordinary people to companies in private spaces.
Lawmakers allege that false matches by the technology, which is yet to be debated in parliament, have led to more than 65 wrongful interventions by the police.