The Windows Recall feature is being released in October, despite the tower of criticism that has piled up against the company. Microsoft’s Recall feature was originally set for a July launch, but customers were staunchly opposed to its ability to record everything they do on their laptops. The Windows AI Recall preview was initially scheduled to reach all Copilot+ PCs on June 18, 2024, but the company decided to shift it to an opt-in preview on the Windows Insider Program (WIP).

On June 13, Microsoft stated that the feature would be available to test in the “coming weeks,” but weeks turned into months before the company landed on a date in October. Will the tests draw a different response from the one the original Microsoft Recall launch received? We’re curious to find out.

Image: Microsoft Recall feature launch — Copilot

Windows Recall Feature Arrives in October, Windows Insider Community All Set To Test It

Microsoft’s Windows AI updates in September 2023 introduced the company’s Copilot AI, marking a fresh chapter in its commitment to artificial intelligence. Since then, Copilot has appeared across Windows PCs, with dedicated spaces to interact with the chatbot right on your desktop.

Microsoft took matters one step further in May, shipping the latest Copilot+ PCs with a dedicated Copilot key on the keyboard for easy access to the AI. Along with a slew of other enhanced capabilities, the Microsoft Recall feature was also launched. Users immediately took issue with the recording of their data, even if it was to be kept private, because it opened up a new avenue of vulnerabilities that the company didn’t seem sufficiently prepared for.


What is the Microsoft Recall Feature?

Announced alongside the Microsoft Windows AI updates, the Recall feature was exclusive to PCs that met Microsoft’s Copilot+ hardware requirements. It was designed to help users retrace their steps and bring up information they had worked with previously. The feature was immediately drowned in controversy because of how it works: it takes periodic screenshots of user activity so that users can later search for any information from the recorded content.

The feature takes screenshots, scans them with optical character recognition (OCR), and saves the extracted text to a database. That database lets users ask their PC what someone said in an email or what they typed into another document, and a natural language query is enough to summon the information back up for review, without having to hunt for it across documents themselves.
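To make the idea concrete, here is a minimal, purely illustrative sketch of that screenshot-text-to-searchable-database pattern, using SQLite’s FTS5 full-text index in Python. The schema, table name, and sample rows are invented for illustration; in the real feature the `text` column would come from OCR of a screenshot, and Microsoft’s actual storage format is not public.

```python
import sqlite3

# In-memory database standing in for Recall's snapshot store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE VIRTUAL TABLE snapshots USING fts5(captured_at, app, text)")

# In the real feature, `text` would be produced by OCR on a screenshot.
rows = [
    ("2024-10-01 09:14", "Outlook", "Dana said the budget review moves to Friday"),
    ("2024-10-01 10:02", "Word",    "Quarterly report draft: revenue grew 8 percent"),
]
conn.executemany("INSERT INTO snapshots VALUES (?, ?, ?)", rows)

# A question like "what did Dana say about the budget?" reduces to a
# full-text search over the indexed snapshot text.
hits = conn.execute(
    "SELECT captured_at, app, text FROM snapshots WHERE snapshots MATCH ?",
    ("budget",),
).fetchall()
for when, app, text in hits:
    print(f"{when} [{app}] {text}")
```

The sketch also shows why the feature alarmed researchers: anything that can read the database file can read everything the user has seen.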

As convenient as that sounds, the feature was rightly described as a “security nightmare.” Using the AI Recall tool would mean that records of sensitive documents and personal information were stored by the AI to summon at a later date. Researchers determined that the feature’s database was not encrypted, so malware could trawl through everything you did during the day. Anyone with physical or remote access to the PC could simply export the database with all of your information, instead of having to crack the passwords on your many protected documents and accounts.

What shocked users most was that the Microsoft Recall launch was expected to reach all Copilot+ PCs without any safeguards addressing these concerns, which made it hard to see the rollout as anything but an incomplete strategy. For now, the Windows Recall feature releasing in October is still restricted to Windows Insider Program testers who volunteer to use it, and the company has made some changes to improve security and privacy.


What to Expect from the Windows AI Recall Release

Microsoft’s announcement does not elaborate on the changes made to the tool since June 13, but the company says a follow-up blog post will provide more details. Until the Windows Recall feature releases in October, all we have to go on are the clarifications that were provided when the Recall controversy was at its peak.

Instead of the Recall tool being turned on by default, users will now have to manually enable the feature to make use of it. According to the Windows AI Recall release page, users will have to enroll in Windows Hello and provide proof of presence in order to review recalled content. With Windows Hello Enhanced Sign-in Security (ESS) decryption support, snapshots are decrypted only after the user has authenticated, so nothing is served from the database without that check.
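As a rough illustration of that gate, the sketch below models “proof of presence” as a secret check that must pass before any snapshot data is returned. Everything here is hypothetical: Windows Hello ESS relies on hardware-backed biometrics and key storage, not a password-style hash like this.

```python
import hashlib
import hmac
import os

class SnapshotVault:
    """Toy store that refuses to reveal snapshots without authentication."""

    def __init__(self, enrolled_secret: bytes):
        # Keep only a salted hash of the enrollment secret, never the secret.
        self._salt = os.urandom(16)
        self._digest = hashlib.pbkdf2_hmac(
            "sha256", enrolled_secret, self._salt, 100_000
        )
        self._snapshots = []

    def add(self, text: str) -> None:
        self._snapshots.append(text)

    def recall(self, presented_secret: bytes) -> list:
        # Authentication happens before any data is released.
        digest = hashlib.pbkdf2_hmac(
            "sha256", presented_secret, self._salt, 100_000
        )
        if not hmac.compare_digest(digest, self._digest):
            raise PermissionError("authentication required to view snapshots")
        return list(self._snapshots)

vault = SnapshotVault(b"user-face-template")
vault.add("email: budget review moved to Friday")
print(vault.recall(b"user-face-template"))  # succeeds after proof of presence
```

The design point the sketch captures is ordering: the check runs before decryption or retrieval, rather than the data sitting in a plainly readable file as in the original preview.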

“When logged into your Copilot+ PC, you can easily retrace your steps visually using Recall to find things from apps, websites, images, and documents that you’ve seen, operating like your own virtual and completely private ‘photographic memory.’ You are always in control of what’s saved. You can disable saving snapshots, pause temporarily, filter applications, and delete your snapshots at any time.”

Microsoft also shared details on the security systems enforced on Copilot+ PCs to make them less vulnerable to external threats. It’s good to see the company respond to user feedback and establish more procedures to address privacy concerns. When the Windows AI Recall tool is finally released, the response to it could be quite different from the fear-fueled reaction most of us have held onto so far.

Google has introduced a similar feature with the Pixel 9 series of devices, where users can take screenshots and search through them using simple language, but the difference is that the recording of the information is entirely voluntary and within the users’ control. Perhaps with the changes that Microsoft has planned, the tool will be better received this time around.