The 'face off' between biometrics and privacy

If Hollywood is any guide, privacy is likely to come off worst, suggests Coffin Mew's Guy Cartwright

In the 1997 film Face/Off, FBI agent Sean Archer (played by John Travolta) has his face transplanted onto homicidal sociopath Castor Troy (played by Nicolas Cage) while Troy is in a coma. Troy then wakes from his coma and has Archer's face transplanted onto his own. The rest of the film follows Archer as he tries to convince the FBI that he is not Troy.

Facial recognition technology might not have helped Archer in that scenario, but fingerprinting or voice recognition might have done. Developments in biometric technology continue to present exciting opportunities, both from a security perspective and in the fields of health and leisure.

But there are also risks associated with biometric technology and, as it starts to play an increasingly important role in how we interact as a society, leaders and decision makers must be mindful of the risks to ensure that organisations process and use biometric data in a responsible and ethical way.

Biometrics: a brief overview

Biometrics as a term means measurements and calculations (metrics) about the body (bio), used to identify and authenticate an individual.

Biometric technology works by capturing an image of a physical feature (e.g. your face or your fingerprint) that remains stable within statistical limits.

The image is then digitally converted through the use of algorithms into a template that is electronically stored.

When the individual again presents themselves to that system, the 'live' digital image of the body part is matched against the stored template, allowing the individual to be identified or authenticated.

Voice biometrics work by comparing a person's voice to a voiceprint stored on file.

This technology asks a fairly simple question: "Are you who you say you are?"
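The enrolment-and-matching process described above can be sketched in code. The sketch below is purely illustrative: real systems use specialised feature-extraction algorithms and tuned thresholds, whereas here simple feature vectors and an assumed similarity cut-off stand in for them.

```python
import math

# Assumed threshold: a real system tunes this to balance false accepts
# against false rejects.
THRESHOLD = 0.95

def cosine_similarity(a, b):
    """Compare two feature vectors; 1.0 means an exact match."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

enrolled = {}  # user id -> stored template

def enrol(user_id, template):
    """Enrolment: convert the captured image to a template and store it."""
    enrolled[user_id] = template

def authenticate(user_id, live_sample):
    """Verification: 'Are you who you say you are?'"""
    template = enrolled.get(user_id)
    if template is None:
        return False
    return cosine_similarity(live_sample, template) >= THRESHOLD

enrol("alice", [0.9, 0.1, 0.4])
print(authenticate("alice", [0.88, 0.12, 0.41]))  # near-identical live sample: True
print(authenticate("alice", [0.1, 0.9, 0.2]))     # different person: False
```

Note that the live sample is never expected to match the template exactly - it only has to fall within the statistical limits, which is why matching is a similarity comparison rather than an equality check.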

A quick look at the marketplace

Companies across various industries continue to invest heavily in biometric technology.

Microsoft recently filed an application in the US Patent and Trademark Office to include a fingerprint sensor in keyboards.

Apple filed an application in the US Patent and Trademark Office for vein recognition technology that could be used when facial recognition technology fails to authenticate the user.

The banking industry has been using biometrics for some time - Barclays, for example, has been using voice recognition technology since 2016. And last month, the Royal Bank of Scotland announced that it was piloting a biometric bank card. RBS described the pilot as "the biggest development in card technology in recent years". The card will allow customers to verify a purchase using a fingerprint.

At CES 2019, the Consumer Technology Association's annual trade show, Procter & Gamble reportedly showed a concept store where cameras recognised shoppers' faces and made shopping recommendations for them as they shopped.

Voice recognition technology is also evolving rapidly and is no longer used just to authenticate the user. A recent patent issued to Amazon would allow Alexa to recognise a range of user characteristics, including accent and emotional state.

And then there is CompanionMX, the app that claims to provide "a new window into human emotion, cognition, and behaviour" by using voice recognition technology. The app records voice features and phone metadata indicative of four digital biomarkers correlated with symptoms of mental health conditions.

Biometric technology undoubtedly offers some exciting possibilities.

Cause for concern?

Biometric technology's greatest asset - the fact that it is time-invariant - is also its Achilles' heel. Once the data is compromised, it is compromised forever, and no technology is infallible. Apple's supposedly secure Touch ID sensor was proven to be vulnerable to hacking through the use of fake fingerprints.

Voice biometric authentication is generally regarded as more secure than fingerprints as it relies on more than just the physical characteristics of a unique voiceprint to authenticate the user. Each voice has around 100 different characteristics.

But biometric identifiers do have a history of being copied. Users would then have to prove that it wasn't them who, for example, accessed a compromised account. This could prove challenging, especially if the system is assumed to be infallible.

Other concerns remain. Sensors could elicit additional data on medical history, for example, during a retina scan. If commercial organisations find out this information, should they have a duty to notify the individual? Would that individual want to know? Should commercial organisations be required to disclose biometric information to law enforcement agencies? These are all questions that will need to be answered before too long.

Conclusion

Biometric technology offers many exciting opportunities to improve lives, and investment in this area should be encouraged.

That said, biometric data is profoundly different from other types of personal data because it is integral to us as human beings - it defines who we are. Biometric technology will have to evolve as attempts to copy biometric identifiers become more sophisticated.

Multimodal identification offers the most secure form of authentication and verification. However, as this article has outlined, biometric technology is now not just being used for the narrow purpose of user identification, and the privacy implications of this will need to be continually assessed.

Organisations must ensure that biometric data is processed responsibly and in compliance with the law. Most importantly, this normally includes obtaining user consent to process the data.

Data protection law is generally technology agnostic and, as it currently stands, does not set out particular technology requirements for processing this type of personal data. This is, however, undoubtedly an area for privacy-enhancing techniques, including pseudonymisation, anonymisation and encryption.
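To give a flavour of what pseudonymisation can look like in practice, the sketch below replaces a stored identifier with a keyed hash, so the raw value never needs to sit in the database. The key name and scheme here are illustrative assumptions, not a prescribed standard, and real deployments would pair this with proper key management.

```python
import hmac
import hashlib

# Assumed secret key: in a real deployment this would live in a secure
# key-management system, never in source code.
SECRET_KEY = b"example-key-kept-in-a-secure-vault"

def pseudonymise(identifier: str) -> str:
    """Return a stable pseudonym for an identifier using HMAC-SHA256.

    The same input always yields the same pseudonym (so records can still
    be linked), but the original identifier cannot be recovered without
    the key.
    """
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

token = pseudonymise("user-12345")
print(len(token))  # 64 hex characters; the raw identifier is not stored
```

Because the pseudonym is deterministic, an organisation can still match records belonging to the same individual, while anyone without the key sees only an opaque token - which is precisely the trade-off pseudonymisation offers over full anonymisation.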

The processing of biometric data is crying out for some additional guidance from regulators or, even better, a binding code of conduct.

Google, for example, has recently set up its own advisory council to help it decide what is and isn't ethical as far as AI and machine learning are concerned. Maybe it will soon have to set up another one to advise on biometrics?

Guy Cartwright is an associate solicitor - Commercial Services at law firm Coffin Mew