
London's King's Cross facial recognition cams under regulator lens

over 5 years ago by Lucy Cinder

Cyber Security

The UK Information Commissioner's Office (ICO) is looking into the facial-recognition system installed by real estate developer Argent LLP in London's King's Cross area, after media reports heated up the debate over unlawful surveillance.

The ICO has announced that it will inspect how the system deployed in the area is operated, in order to make sure that no data protection laws are being violated.

The Financial Times was the first to report on the use of a live face-scanning system across the 67-acre site around King's Cross station in London.

"Scanning people’s faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all. That is especially the case if it is done without people’s knowledge or understanding," said information commissioner Elizabeth Denham in an announcement.

"I remain deeply concerned about the growing use of facial recognition technology in public spaces, not only by law enforcement agencies but also increasingly by the private sector. My office and the judiciary are both independently considering the legal issues and whether the current framework has kept pace with emerging technologies and people’s expectations about how their most sensitive personal data is used," she added.

London mayor Sadiq Khan wrote to the developer enquiring about its use of facial-recognition systems, the BBC reported. 

"King’s Cross station’s use of facial recognition technology highlights the potential threat to personal privacy that today’s advanced technologies can bring. This technology is being used to track tens of thousands of people in one of the busiest stations in London.  Ultimately, it raises the question – how much surveillance is too much?" asked David Emm, principal security researcher at Kaspersky

The real estate developer has not disclosed the scope and duration of the system or the use of the data collected. Media enquiries from the FT, the BBC, and the Register received the same stock reply from the company's PR team.

"These cameras use a number of detection and tracking methods, including facial recognition, but also have sophisticated systems in place to protect the privacy of the general public," it said.

"Facial recognition and video surveillance are covered by a complex web of regulations which isn’t easy to navigate," said Tamara Quinn, partner at law firm Osborne Clarke.

"Under the GDPR, use of biometrics, such as facial recognition systems, is covered by stricter safeguard than ordinary personal data. For many companies, this means that they may need to get consent from every person scanned and prove that these individuals were fully informed and have given consent freely, without pressure or being penalised for not participating," she added.

"It is completely understandable that the general public will want to know why they are being monitored, and for what purposes," said Jason Sierra, sales director for Seven Technologies Group. "However, with facial recognition technology, they are not being monitored in the traditional sense. They are being detected and analysed but then ignored as a threat."

"This is a huge potential threat to people’s personal freedom, and it’s evident that no one has given consent to the capture of their personal features in this way. Kings Cross may insist that the collection of such data is being done purely for "public safety" and to give its customers the ‘best possible experience’, but it's unclear what benefit is offered by collecting such granular data," noted Emm.

The ICO investigation announcement comes on the heels of the Biostar 2 fiasco. The global biometrics vendor behind that platform, whose customers include the London Metropolitan Police, had left data -- including fingerprints, facial recognition details, face photos of users, unencrypted usernames and passwords, and personal details of employees -- unprotected.

King's Cross also faces a similar risk, said Emm.

"Like all data, it will be of value and therefore desired by cybercriminals. So, if this data falls into the hands of the wrong people, the outcome could be catastrophic. At the end of the day, we can change our passwords, but we can’t change our facial or other physical features, and businesses should be transparent on exactly how this technology could be used before unleashing it on an unsuspecting public."

Source: SC Magazine UK
