In response to security concerns, Microsoft has provided details on how it has revamped its controversial AI-powered Recall feature, which captures screenshots of nearly everything you do or see on a computer.
Recall was originally slated to debut with Copilot Plus PCs in June, but Microsoft has spent the last few months improving the security behind it, making it an opt-in feature that users can fully uninstall from Windows if desired.
“I’m actually really excited about how deeply we dived into the security architecture,” said David Weston, Microsoft’s vice president of enterprise and OS security, during an interview.
“I think the security community is going to appreciate how much effort we’ve put into enhancing Recall.”
One of the major changes Microsoft has made is that Recall will no longer be enabled by default.
“There is no more default-on experience at all—you have to opt into this,” said Weston. “That’s crucial for people who don’t want the feature, and we fully understand that.”
Earlier this month, an uninstall option for Recall appeared on Copilot Plus PCs, which Microsoft initially referred to as a bug.
However, it turns out users will indeed be able to fully remove Recall. “If you choose to uninstall it, we remove everything from your machine,” Weston explained, including the AI models powering the feature.
Initially, security researchers found that Recall’s database, which stores screenshots taken every few seconds, was not encrypted, leaving it vulnerable to malware attacks.
Microsoft has since encrypted all sensitive elements of Recall, including its screenshot database, and is using Windows Hello to safeguard against tampering.
The encryption of Recall is now tied to the Trusted Platform Module (TPM), which is required for Windows 11. This means the encryption keys are stored in the TPM, and access to them is granted only through authentication via Windows Hello.
The only time Recall data is sent to the user interface is when the user activates the feature and authenticates via facial recognition, fingerprint, or PIN.
“To enable Recall, you need to be present as the user,” said Weston. This means users must use facial recognition or fingerprint authentication to set up the feature, which will then allow PIN-based access.
This setup is intended to prevent malware from exploiting Recall data in the background, with Microsoft requiring user presence verification through Windows Hello.
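Conceptually, the flow described above can be sketched as follows. This is a minimal illustrative model, not real Windows code: the class and function names are invented, and the actual key release is handled by the TPM hardware and Windows Hello, not application logic. The core idea is that the database key stays sealed unless a present, authenticated user triggers its release.

```python
from dataclasses import dataclass

# Hypothetical sketch of the described design: the Recall database key is
# sealed to the TPM and released only after a Windows Hello presence check.
# None of these names are actual Windows APIs.

@dataclass
class TpmSealedKey:
    key_material: bytes

    def unseal(self, hello_verified: bool) -> bytes:
        # The TPM releases the key only for an authenticated, present user.
        if not hello_verified:
            raise PermissionError("Windows Hello authentication required")
        return self.key_material

def open_recall_database(sealed_key: TpmSealedKey, hello_verified: bool) -> str:
    key = sealed_key.unseal(hello_verified)  # raises if the user is absent
    return f"database decrypted with {len(key) * 8}-bit key"

key = TpmSealedKey(key_material=b"\x00" * 32)
print(open_recall_database(key, hello_verified=True))
# → database decrypted with 256-bit key

# A background process without Windows Hello proof gets nothing:
try:
    open_recall_database(key, hello_verified=False)
except PermissionError as exc:
    print(exc)  # → Windows Hello authentication required
```

The point of the design, as Weston describes it, is that malware running in the background has no code path to the decryption key: the key release is gated on proof of user presence, not merely on the user account being logged in.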
“We’ve moved all screenshot processing and sensitive operations into a virtualization-based security enclave, essentially placing them inside a virtual machine,” Weston added.
This design ensures that the user interface (UI) layer of the app does not have access to raw screenshots or the Recall database.
When a user interacts with Recall and conducts searches, Windows Hello prompts authentication, queries the virtual machine, and retrieves data. Once the user closes the Recall app, all memory is erased.
“The app outside the virtualization enclave is protected against malware, and any malicious attempt would require a kernel driver to breach it,” said Weston.
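The enclave boundary Weston describes can be modeled roughly like this. Again, this is a hedged sketch with invented names, assuming the behavior the article reports: raw screenshots and the database never leave the enclave, the UI receives only derived query results after authentication, and session memory is cleared when the app closes.

```python
# Illustrative model of the reported VBS-enclave design, not real Windows code.
# Raw snapshot data stays inside the enclave object; the UI layer only ever
# sees derived search results, and only after authentication.

class RecallEnclave:
    def __init__(self, snapshots):
        self._snapshots = snapshots   # raw data never crosses the boundary
        self._session_memory = []

    def query(self, authenticated: bool, term: str):
        if not authenticated:
            raise PermissionError("Windows Hello prompt not satisfied")
        results = [s["summary"] for s in self._snapshots if term in s["summary"]]
        self._session_memory.extend(results)
        return results                # only derived results are returned

    def close_session(self):
        self._session_memory.clear()  # "all memory is erased" on app close

enclave = RecallEnclave([{"summary": "budget spreadsheet"},
                         {"summary": "travel booking"}])
print(enclave.query(authenticated=True, term="budget"))  # → ['budget spreadsheet']
enclave.close_session()
```

In the real design, the isolation comes from virtualization-based security rather than object encapsulation, which is why Weston notes that breaching it from outside would require a kernel driver.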
Microsoft has outlined the full security model of Recall and its Virtualization-Based Security (VBS) enclave in a blog post.
This approach looks significantly more secure than the original design and may offer a glimpse of how Microsoft could harden future Windows applications.
As for why Microsoft almost launched Recall in June without these robust security measures, the details are somewhat unclear.
Weston confirmed that Recall had been reviewed under Microsoft’s Secure Future Initiative, introduced last year, but being a preview product, it had certain restrictions.
“The plan was always to include Microsoft’s fundamental security features, like encryption. But we also received feedback from people who were very concerned about the risks,” Weston noted. That feedback led Microsoft to accelerate security measures it had already planned for Recall, so that concerns about the feature wouldn’t deter adoption.
“This goes beyond just Recall,” Weston hinted. “We now have one of the strongest platforms for handling sensitive data processing on the edge, and this opens up a lot of other possibilities.”
He emphasized that accelerating these security investments has positioned Recall as a premier platform for data protection.
Another important change is that Recall will only be available on Copilot Plus PCs, preventing users from sideloading it onto other Windows machines as seen earlier this year.
To verify a Copilot Plus PC, Recall will check for the presence of features like BitLocker, virtualization-based security, secure boot, system guard protections, and kernel DMA protection.
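The gating check described above amounts to requiring that every listed platform feature is present. The feature names below follow the article; the checking function itself is a hypothetical sketch, since Microsoft has not published the actual verification code.

```python
# Sketch of the described eligibility gate: Recall runs only when every
# required platform security feature is detected. Illustrative only.

REQUIRED_FEATURES = {
    "bitlocker",
    "virtualization_based_security",
    "secure_boot",
    "system_guard",
    "kernel_dma_protection",
}

def is_recall_eligible(detected_features: set) -> bool:
    # Every required feature must be present; extras are allowed.
    return REQUIRED_FEATURES.issubset(detected_features)

full_featured = REQUIRED_FEATURES | {"tpm_2_0"}
print(is_recall_eligible(full_featured))                        # → True
print(is_recall_eligible(full_featured - {"secure_boot"}))      # → False
```

A check of this shape is what would block the sideloading seen earlier this year, since machines that are not Copilot Plus PCs will fail one or more of the required-feature tests.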
Microsoft has also conducted extensive reviews of Recall’s updated security. The Microsoft Offensive Research Security Engineering (MORSE) team spent months conducting design reviews and penetration tests on the feature.
Additionally, a third-party security firm performed an independent security design review and testing.
With more time to work on Recall, Microsoft has introduced additional control features. Users will now be able to filter out specific apps and block certain websites from being captured by Recall.
Sensitive content filtering will also prevent the storage of passwords, credit card information, health data, and financial details.
Users will even have the option to delete specific time ranges, content from particular apps or websites, or the entire Recall database.
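The control surface described in the last few paragraphs — per-app and per-site capture filters plus deletion by time range or in full — could be modeled like this. All names are hypothetical; this is a sketch of the reported behavior, not Microsoft's implementation.

```python
# Hypothetical model of Recall's reported user controls: block specific apps
# and websites from capture, and delete snapshots by time range or entirely.

class RecallControls:
    def __init__(self):
        self.blocked_apps = set()
        self.blocked_sites = set()
        self.snapshots = []   # (timestamp, app, site) tuples

    def should_capture(self, app, site=None):
        # Filtered apps and sites are never captured in the first place.
        return app not in self.blocked_apps and site not in self.blocked_sites

    def delete_range(self, start, end):
        # Remove every snapshot whose timestamp falls in [start, end].
        self.snapshots = [s for s in self.snapshots if not (start <= s[0] <= end)]

    def delete_all(self):
        self.snapshots.clear()

ctl = RecallControls()
ctl.blocked_apps.add("PasswordManager.exe")
print(ctl.should_capture("PasswordManager.exe"))  # → False
print(ctl.should_capture("Notepad.exe"))          # → True

# Integers stand in for timestamps here:
ctl.snapshots = [(1, "Notepad.exe", None), (5, "Edge.exe", "bank.example")]
ctl.delete_range(4, 6)
print(len(ctl.snapshots))                         # → 1
```

Note that sensitive-content filtering (passwords, credit cards, health and financial data) is a separate, content-based layer on top of these explicit user-configured filters.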
Microsoft remains on track to preview Recall with Windows Insiders on Copilot Plus PCs in October, meaning that the feature won’t ship with new laptops or PCs until after the Windows community has thoroughly tested it.