Microsoft to discontinue face recognition as it updates AI guidelines
Amid the ongoing debate about the controversial use of facial recognition technology, Microsoft has announced it is retiring certain AI facial recognition capabilities.
The capabilities being retired include those that can infer emotional states and identity attributes such as age, gender, smile, hair, and makeup.
The decision is part of a more extensive revamp of Microsoft's ethical AI guidelines. The company's revised Responsible AI Standard, which sets out guidelines for building AI systems, places an emphasis on knowing who uses its services and on greater human oversight of how these tools are used.
A responsible approach to AI means keeping people and their goals at the forefront of design choices, says Microsoft.
'AI systems are the product of many different decisions made by those who develop and deploy them. From system purpose to how people interact with AI systems, we need to proactively guide these decisions toward more beneficial and equitable outcomes,' the company said in a blog post.
'That means keeping people and their goals at the center of system design decisions and respecting enduring values like fairness, reliability and safety, privacy and security, inclusiveness, transparency, and accountability.'
For example, technologies that could be used to make decisions about a person's access to employment, healthcare or financial services like mortgage applications are subject to a human-led review.
The move to restrict access to some facial recognition functions has been taken with this in mind.
New customers will no longer be able to use these characteristic detection features, while existing customers retain access until 30 June 2023.
The firm is not abandoning facial recognition, though: it will incorporate the technology into 'controlled' accessibility tools like Seeing AI, which assists users with vision problems.
Microsoft focused on the case of emotion classification specifically, pointing out the 'important questions' about privacy that the technology raises and the lack of agreement on what constitutes 'emotions'.
Critics have also voiced doubts about whether the link between facial expression and emotional state holds across use cases, regions, and demographics - many systems, for example, struggle when presented with non-white faces.
The Custom Neural Voice function, which allows users to create AI voices based on recordings of actual people (audio deepfakes), will also be subject to similar restrictions as the facial recognition tech.
Sarah Bird, principal group product manager for Azure AI at Microsoft, says the voice tool has potential in the areas of education, accessibility, and entertainment. However, she also notes that it is easy to imagine how it could be used "inappropriately" (read: to scam people).
Developers who want to use the face recognition and voice impersonation features in the future will need to apply for access and explain how they intend to deploy them.
Microsoft is not the first company to have second thoughts about facial recognition.
Two years ago, IBM stopped its work in the area because of concerns that its initiatives could be used to violate human rights.
In a letter to the members of the US Congress in June 2020, IBM CEO Arvind Krishna said the company would no longer sell general purpose facial recognition software, and would oppose any use of such technology for racial profiling, mass surveillance, violations of basic human rights or any purpose "which is not consistent with our values and principles of trust and transparency".
Facebook shuttered its decade-old face recognition system last year, amid concerns about how the technology was being used.
Last year, the European Union presented the bloc's first-ever legal framework for regulating high-risk applications of AI technology. Lawmakers said they want to achieve 'proportionate and flexible rules' that address the risks of AI and strengthen Europe's position in setting the highest standards for regulating the technology.