As it turns out, customer protests do work sometimes: Microsoft has delayed the release of AI Recall. The company has been relentlessly promoting its Copilot AI tool and all its associated features since they were formally announced in 2023, to a mixed response. After the recent discussion around the controversial AI Recall feature, however, the feedback has been largely negative, with users expressing their concerns over security and privacy. The tool was supposed to make its debut on June 18, 2024, but Microsoft has decided to delay the AI feature’s global launch, choosing instead to release it to a limited audience as part of the Windows Insider Program (WIP).

Windows AI feature delay

Image: Microsoft CEO Satya Nadella

Microsoft Delays AI Recall—Privacy Is a Thing of the Past

Security concerns around AI tools are nothing new, and we’ve gone over the dangers of this technology being trained on our personal data over and over again. The controversial AI Recall feature took matters one step further: Microsoft announced that the feature would let the AI assistant take screenshots of your computer screen at regular intervals to help it learn about the user and serve them better.

The “everyday pilot” is expected to help us be “more connected to everything that matters to us,” but to do this, Microsoft believes it needs constant access to your personal information so you never have to think about where a particular piece of information is ever again.

Before the Windows AI feature was delayed, what we knew about the service was that it relied on a “photographic memory,” taking snapshots of your activity and extracting data from them to keep a record of what you do on your device. “With Recall, we’re going to leverage the power of AI and the new system performance, to make it possible to access virtually anything you have ever seen on your PC,” said Yusuf Mehdi, Executive Vice President and Consumer Chief Marketing Officer, about the tool.

What is Microsoft’s AI Recall?

Our PCs have data scattered across their nooks and crannies, and it is often hard to recall where you saw an image or which document contained a particular piece of information. Unless you remember the name of the document or file, it is difficult to find that information again. With Microsoft’s AI Recall, you can use simple language to ask the AI about the information and let it do the searching for you. Recall periodically takes a snapshot of everything you do on your device and stores it for later use, analyzing the data in each image rather than just saving a bunch of screenshots.

When you look for a specific piece of information, the AI is able to scour the images for relevant information and bring up the one that matches your search. This way, the AI doesn’t need access to your apps, nor does it need to jump through the hoops of securing permissions from all the different services you use on your device.

All it needs to do is capture images of your activity, even if it involves a private chat. Microsoft’s AI Recall is adept at semantic learning and association, picking up as much data as possible from the images and making a connection even if your search terms are never directly used within the image.
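The mechanism described above can be sketched in miniature. This is purely an illustrative model, not Microsoft’s implementation: snapshots are represented here by the text already extracted from each screenshot, and “semantic” matching is crudely stood in for by word overlap, so a result can surface even when the snapshot doesn’t contain your exact phrasing.

```python
# Hypothetical sketch of a Recall-style local snapshot index.
from dataclasses import dataclass


@dataclass
class Snapshot:
    timestamp: str
    app: str
    extracted_text: str  # what on-device analysis pulled from the image


class LocalSnapshotIndex:
    def __init__(self):
        self._snapshots: list[Snapshot] = []

    def capture(self, snapshot: Snapshot) -> None:
        # In the real feature this would be a periodic screenshot plus
        # on-device analysis; here we just store the extracted text.
        self._snapshots.append(snapshot)

    def search(self, query: str) -> list[Snapshot]:
        # Rank snapshots by word overlap with the query -- a crude
        # stand-in for the semantic matching the article describes.
        words = set(query.lower().split())
        scored = [
            (len(words & set(s.extracted_text.lower().split())), s)
            for s in self._snapshots
        ]
        return [s for score, s in sorted(scored, key=lambda p: -p[0]) if score > 0]


index = LocalSnapshotIndex()
index.capture(Snapshot("2024-06-01T10:00", "Browser", "flight booking to Paris in July"))
index.capture(Snapshot("2024-06-01T10:05", "Chat", "dinner plans for Friday evening"))

results = index.search("when was my Paris flight")  # surfaces the Browser snapshot
```

Note that the search never touches the browser or chat app itself, which is the point the paragraph above makes: everything flows through the captured images.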

There is no denying it—Microsoft’s AI Recall is a useful tool, and it could really simplify how you bring up information, especially when you only remember fragments of what you need. However, having data stored like this feels very unsafe to most users, despite Microsoft’s attempts to assuage these AI security concerns.

Microsoft AI security concerns

Image: Microsoft aims to simplify the search experience on-device with natural language prompts

Understanding the Microsoft AI Security Concerns

When an interviewer noted, “There could be some reaction from people that ‘This is pretty creepy, Microsoft is taking screenshots of everything I do,’” Microsoft CEO Satya Nadella responded, “I mean that’s why you can only do it on the edge, you have to put two things together—this is my computer and this is my Recall. It’s all being done locally.” While the AI still sounds creepy, the company tried to reassure users that the Microsoft AI security concerns were unwarranted, as the storage of this data all happens locally on the device—the place where the data already existed in the first place.

Mehdi had similarly reassured users when the feature was announced, “We’ve built Recall with responsible AI principles and aligned it with our standards. And we’ve taken a very conservative approach. We’re going to keep your Recall index private, and local, and secure on just the device.” According to the company, none of the recorded data is used to train the AI and you also have the freedom to edit and delete any of the stored information. 

Realistically, we never let go of our junk until we get a reminder that our storage is full and we’re left with no choice. Will users, even those concerned about the controversial AI Recall feature, regularly review and delete sensitive information? It’s unlikely.

Explaining the Microsoft AI security concerns, many have pointed out that storing the extracted information in a plain-text database could make it easy for an attacker to access your information without working very hard for it. With the rise of malware and cybercrime in general, this could quickly turn Microsoft Recall into a tool that criminals exploit. Unless Microsoft puts multiple precautionary, encrypted, unbreachable layers between your data and an outsider, the company cannot credibly claim the Recall tool is safe to use.
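The worry researchers raised is easy to demonstrate in miniature. The database name and schema below are hypothetical, but the principle holds for any unencrypted local store: a second, entirely separate process that can read the file can dump everything in it, no exploit required beyond file access.

```python
# Illustration (hypothetical schema) of why an unencrypted local index
# worries researchers: reading it back requires nothing but file access.
import os
import sqlite3
import tempfile

db_path = os.path.join(tempfile.mkdtemp(), "recall_index.db")

# The "Recall" side: store extracted snapshot text in plain text.
conn = sqlite3.connect(db_path)
conn.execute("CREATE TABLE snapshots (ts TEXT, content TEXT)")
conn.execute(
    "INSERT INTO snapshots VALUES (?, ?)",
    ("2024-06-01T10:00", "bank login page: account 1234"),
)
conn.commit()
conn.close()

# The "attacker" side: a completely separate connection reads it all back.
stolen = sqlite3.connect(db_path).execute(
    "SELECT content FROM snapshots"
).fetchall()
```

This is why the criticism centers on encryption at rest rather than on where the data lives: local storage alone is no barrier to malware already running on the machine.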

Controversial AI Recall feature

Microsoft Delays AI Recall’s Full-Scale Release 

The feature was all set for release on the Copilot+ PCs gearing up for launch, but for now it will only be available to users who choose to enable it. Microsoft may have delayed AI Recall, but the company isn’t going so far as to cancel the feature just yet. A preview version should become more widely available after a few more months of feedback collection and security testing. Announcing the Windows AI feature delay, the company also shared a few of the measures it has taken after considering feedback from users.

This begins with an improved setup experience on upcoming Copilot+ PCs, where users will have to clearly opt in to enable the Recall feature, as it will be turned off by default. The company also announced that authentication will be required to access the timeline, with “just-in-time decryption protected by Windows Hello Enhanced Sign-in Security (ESS) so Recall snapshots will only be decrypted and accessible when the user authenticates.” Microsoft went on to assure users that the PCs themselves have advanced security measures in place, with Microsoft Pluton security processors enabled by default.
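The “just-in-time decryption” language describes a familiar pattern: snapshots stay encrypted at rest, and decryption only happens at the moment an authenticated user asks for them. The sketch below illustrates that control flow only; the boolean authentication flag and the toy XOR keystream cipher are stand-ins for illustration, where the real feature would rely on Windows Hello ESS and proper cryptography.

```python
# Toy sketch of decrypt-on-authenticate. The cipher here is NOT secure;
# it exists only to show the "encrypted at rest, decrypted just in time"
# control flow described in Microsoft's announcement.
import hashlib


def _keystream(key: bytes, length: int) -> bytes:
    # Derive a repeatable byte stream from the key (toy construction).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]


def _xor(data: bytes, key: bytes) -> bytes:
    # XOR with the keystream; applying it twice restores the original.
    return bytes(a ^ b for a, b in zip(data, _keystream(key, len(data))))


class SnapshotVault:
    def __init__(self, key: bytes):
        self._key = key
        self._store: list[bytes] = []  # ciphertext only, never plaintext

    def add_snapshot(self, text: str) -> None:
        self._store.append(_xor(text.encode(), self._key))

    def read_snapshots(self, user_authenticated: bool) -> list[str]:
        # Decryption happens just-in-time, gated on authentication.
        if not user_authenticated:
            raise PermissionError("authenticate first")
        return [_xor(blob, self._key).decode() for blob in self._store]


vault = SnapshotVault(key=b"per-device secret")
vault.add_snapshot("meeting notes: Q3 budget")
```

The design point is that the stored blobs are useless on their own: an attacker who copies the vault's contents still needs the key, which in Microsoft's described scheme is only released after a Windows Hello authentication.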

Microsoft’s decision to delay AI Recall appears to be the best outcome we could have expected from the company. The controversial AI Recall feature is not going away permanently, but perhaps we will have more evidence of its safety measures and the additional protections put in place before we see a widespread release of the tool.