Saving face: Five key considerations when navigating the use of facial recognition technology
Facial recognition brings legal and reputational risks
Here are the key steps to remember when considering facial recognition technology.
If you travelled abroad this summer, you might not have given much thought to the technology in use at border control on your return. Whether it's being used at airport checkpoints, for policing, or even monitoring employee attendance, facial recognition technology (FRT) is on the rise. It's no surprise, then, that we're seeing a concomitant increase in client enquiries about the use of such technology.
FRT and other forms of biometric recognition technology are not without legal and reputational risk. They involve the processing of personal data and, often, the use of AI, a form of technology that is subject to growing scrutiny. To help you avoid falling foul of the rules, we're going to explore five key considerations when navigating FRT and similar technologies.
1. What is the purpose of using the technology?
To begin with, ask yourself (and any other stakeholders within your organisation) what business need you're seeking to address using such technology. Are you adopting FRT for security purposes or to understand trends in consumer behaviour (e.g. how many different individuals visit your store)? Do you have a lawful basis for collecting personal data, including biometric data, using the technology?
It's also important to consider whether you're able to achieve your objectives via less invasive means.
2. What is the technology actually doing and does it use AI?
Understanding the mechanics of FRT is critical to ensure you can explain how the technology works to affected individuals and fulfil your transparency obligations under UK data protection law. Does the technology include hardware (e.g. CCTV cameras) and software (e.g. software that matches faces against a database or converts them into unique reference numbers)?
If the FRT is AI-enabled, there are heightened risks involved, including bias. While not directly applicable in the UK, the risk mitigation measures in the newly passed EU AI Act offer a useful framework for risk management.
3. What kind of data is being collected and processed?
There is a distinction between collecting biometric data (e.g. data relating to a person's face, voice or fingerprints) and special category biometric data (i.e. biometric data used for the purpose of uniquely identifying an individual). The latter is considered more sensitive and carries greater risks, such as bias and profiling. As such, you will require a valid condition for processing it under UK data protection law, such as substantial public interest.
4. What have we done to mitigate the risks involved?
Have you prepared a data protection impact assessment? Have you undertaken due diligence on the supplier of the technology? Engaging with relevant stakeholders across your business (e.g. your DPO, IT team and HR) will simplify these processes. Ensure you've carefully reviewed the supplier agreement to understand how risk and responsibility are allocated, and undertake an information security assessment to satisfy yourself that the technology is cybersecure.
5. Is your internal governance structure up to date?
Think about what internal policies and procedures you will need to implement or update to support the introduction of the technology. Inform your employees and customers that they will be subject to FRT by updating your internal and external privacy information (i.e. your privacy policies). For a belt-and-braces approach, display public signage that alerts affected individuals to the use of the technology, and consider how you will respond to any objections you receive.
The rise of FRT and similar technologies raises questions, and so should you. Gather the information you need to understand what the technology you are rolling out is intended to achieve, how it works, how liability is apportioned and what risk mitigation measures will be implemented. To save face, ask questions and be prepared to answer them.
Ashley Avery is a partner and head of commercial, tech & data, Paolo Sbuttoni is a partner and Rachel Griffith is an associate, all at Foot Anstey.