Analysts share security concerns about Microsoft's 'Recall' feature
'An act of self-harm in the name of AI'
Microsoft's unveiling of the "Recall" feature at Build 2024 continues to generate controversy, with industry analysts joining the call for a rethink.
Recall, a Windows feature currently in preview, captures screenshots of a user's screen every few seconds and stores them locally. While intended to enable users to easily search and revisit past activities, the feature has raised serious security and privacy concerns.
"Should Microsoft Recall Be Recalled?" asked analyst and former CTO of Microsoft's IT unit Barry Briggs in a blog post, echoing existing concerns about capturing sensitive data in screenshots.
"Recall will record sensitive interactions – such as, for example, the browser screen from your bank, the Monday Night Football game you were furtively watching during a late meeting… or other content you perhaps might not want captured," Briggs said.
"Whether or not Recall records passwords such that they can be extracted isn't yet clear – but maybe, or maybe in some cases. All this seems very scary indeed and I for one will waste no time turning Recall off."
Briggs questioned the logic behind Recall, highlighting Microsoft's existing Purview compliance services.
Purview already allows companies to monitor employees' activity, and Briggs raised the possibility that Recall's data could be put to similar use, whether by compliance teams or even law enforcement.
Briggs said it was highly likely that "well-funded and well-trained foreign actors" would soon try to break the code to steal users' confidential information.
"Will they be successful? Who knows, but Recall has presented them with yet another target."
Briggs also pointed to Microsoft's existing, neglected search functionality, questioning why the company chose to prioritise Recall over improving a core Windows feature.
He challenged the value proposition of Recall for both users and businesses.
Cybersecurity researcher Kevin Beaumont recently claimed to have found significant security flaws in Recall, discovering that it stores the text extracted from those screenshots in an unencrypted, plain-text database accessible on the PC itself.
"This database file has a record of everything you've ever viewed on your PC in plain text," Beaumont said, highlighting the vulnerability.
Microsoft maintains that Recall is secure, with on-device data encryption and privacy controls like URL and app filtering.
However, Beaumont argues the encryption offers limited protection: the data is decrypted whenever the user is logged in and actively using the PC, leaving it readable by any malware running under that user's account.
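Beaumont's underlying point is that anything stored unencrypted, or decrypted while the user is logged in, in a per-user file can be read by any code running under that same account. The sketch below illustrates that threat model in Python; the path, table, and column names are placeholders chosen for illustration, not Recall's actual on-disk layout.

    # Illustrative sketch only: shows why a locally readable capture database
    # is a problem. The path and schema below are hypothetical placeholders.
    import os
    import sqlite3

    # Placeholder location for a per-user activity database.
    DB_PATH = os.path.expandvars(r"%LOCALAPPDATA%\ExampleVendor\activity_capture.db")

    def dump_captured_text(db_path: str) -> None:
        """Print every row; any process running as this user could do the same."""
        conn = sqlite3.connect(db_path)
        try:
            # Hypothetical table: one row of OCR'd text per captured screenshot.
            for timestamp, window_title, text in conn.execute(
                "SELECT timestamp, window_title, captured_text FROM captures"
            ):
                print(timestamp, window_title, text[:80])
        finally:
            conn.close()

    if __name__ == "__main__":
        # Malware running under the logged-in user's account has exactly the same
        # file-system access as this script -- no privilege escalation needed.
        dump_captured_text(DB_PATH)

In other words, the attack does not require administrator rights or a break of the device's disk encryption, only code execution as the victim.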
Privacy advocates are also wary of Recall's "opt-out" approach: while users can disable the feature entirely, it may be possible for administrators to re-enable it remotely.
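Because that switch lives in machine-wide policy rather than solely in the user's own settings, whether Recall stays off is ultimately an administrative decision. The rough sketch below checks the Group Policy value that early preview documentation associates with Recall (DisableAIDataAnalysis under the WindowsAI policy key); treat those names as an assumption to verify against current Microsoft guidance.

    # Sketch: check whether a machine-wide policy disables snapshot capture.
    # Key and value names come from preview-era documentation and may change.
    import winreg

    POLICY_KEY = r"SOFTWARE\Policies\Microsoft\Windows\WindowsAI"
    POLICY_VALUE = "DisableAIDataAnalysis"

    def recall_disabled_by_policy() -> bool:
        """Return True if machine policy turns snapshot saving off."""
        try:
            with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, POLICY_KEY) as key:
                value, _ = winreg.QueryValueEx(key, POLICY_VALUE)
                return value == 1
        except FileNotFoundError:
            # No policy set: behaviour falls back to the user's own toggle,
            # which an administrator could later override.
            return False

    if __name__ == "__main__":
        print("Disabled by policy:", recall_disabled_by_policy())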
Beaumont compared the situation to "watching Microsoft become an Apple Mac marketing department," implying a disconnect from user needs.
"Windows is a personal experience. This shatters that belief," he said.
"I think they are probably going to set fire to the entire Copilot brand due to how poorly this has been implemented and rolled out. It's an act of self-harm at Microsoft in the name of AI, and by proxy real customer harm."