Biometric Data Breach Could Link Your Face to Illegal Activities

An exhibitor tests a Dermalog LF-10 biometric fingerprint scanner at the Dermalog Identification Systems GmbH pavilion at the CeBIT 2017 tech fair in Hannover, Germany. (Photographer: Krisztian Bocsi/Bloomberg)

(Bloomberg) -- The nature of how organizations capture and store the public’s biometric data, such as fingerprints and images of faces, came under renewed scrutiny this week by security experts and regulators.

Britain’s Information Commissioner’s Office said it was opening an investigation into the use of facial-recognition camera technology at London’s Kings Cross development. It followed revelations on Wednesday that millions of pieces of personal biometric data may have leaked from Biostar 2, a popular security platform.

“Scanning people’s faces as they lawfully go about their daily lives, in order to identify them, is a potential threat to privacy that should concern us all,” Elizabeth Denham, the U.K.’s Information Commissioner, said in a statement Thursday.

In addition to seeking information about how the technology will be used, she said the ICO “will also inspect the system and its operation on-site to assess whether or not it complies with data protection law.”

Kings Cross said earlier in the week that it had “sophisticated systems in place to protect the privacy of the general public.”

Privacy is only one worry, however. Although the Biostar 2 data breach was unconnected to Kings Cross or the ICO’s concern, security experts said both events served as an important reminder of the risks associated with the use of the public’s imagery.

“Security is only as strong as the weakest link,” Michela Menting, research director at ABI Research, said in an interview. “Far too many companies are lacking in their risk analysis and security evaluations.”

She said the growing trend of so-called deepfakes -- audio and video fabricated to depict people saying or doing things they never said or did -- was a particular concern in this regard.

“The key ingredient in truly credible deepfakes is having a lot of data on the subject, and notably video of a person in any number of different facial expressions,” Menting said. “One can imagine that leaks of facial recognition information can help to build better databases.”

Once leaked, biometric data such as digital fingerprints can affect a person for life: the information cannot be recalled, and one cannot replace a finger the way a compromised password can be reset.

To contact the reporter on this story: Ali Ingersoll in London at aingersoll1@bloomberg.net

To contact the editors responsible for this story: Giles Turner at gturner35@bloomberg.net, Nate Lanxon, Andrew Pollack

©2019 Bloomberg L.P.