Apple’s Visual Intelligence search feature is here to rival Google Lens, and the reaction to the new tool has been sharply divided. The iPhone AI visual feature was introduced during the “It’s Glowtime” event on September 9, and while Apple enthusiasts found it exciting, Android users were quick to say there was nothing new to see.

Apple organized the event primarily to bring the iPhone 16 lineup to its loyal customer base, but we were also on the receiving end of a few unexpected software surprises, nearly all to do with artificial intelligence. The Apple Visual Intelligence AI tool is expected to come out “later this year” with future iOS updates, so iPhone 16 series users have a lot to look forward to.


Image: Freepik

Apple Visual Intelligence Search—What Is It?

The iPhone’s AI visual feature will allow users to take a picture of their surroundings and automatically look up information related to the contents of the image. Using a “combination of on-device intelligence and Apple services that never store your images,” the AI tool will be able to use your camera to scan your environment and read out text for you, or search for the breed of a cute dog you saw on the street.

Many users have pointed out that iPhones are already capable of doing this to a degree, but that doesn’t matter when you can re-advertise it explicitly as an AI tool. The tool could simplify your travel experience or work effectively as a disability support tool for those with visual impairments, reading out or describing desired information to a user.

Features like this have been one of the few reasons we see the benefit of VR glasses, but having such functionality available on a smartphone doesn’t hurt either. The Visual Intelligence software could be a building block for future VR glasses from the company, though we don’t appear to be any closer to seeing that particular product materialize.

Apart from looking up information, the Apple Visual Intelligence search tool can also scan information from a poster or invite and add details such as the time and location presented in the document to your calendar, so it is more than just a scanner. Aspects like this make it more than just a Google Lens alternative; at the end of the day, however, it doesn’t offer many additional capabilities beyond that.

Is the Apple Visual Search Technology Any Good?

It’s too soon to say how well the tool will work and how accurately it can provide search results. The Apple tool may well become a commonplace function for iPhone 16 users, but such a transition will take a while to occur.

It also doesn’t help that the promotional video for the feature states that “if you come across a bike that looks exactly like the kind you’re in the market for, just tap to search Google for where you can buy something similar.” The app integration is great to see, but you could use Google Lens to do the same thing directly.

Apple did not elaborate on whether the tool will be able to translate text on sight, which is specifically what we use Google Lens for most often, but we’re hoping its capabilities aren’t limited to identifying information and adding it to our calendar.

Nor is Google Lens the only tool Apple appears to be duplicating. Other apps like Pinterest and Vinivo have their own built-in visual search tools, and CamFind, PictureThis, and similar apps exist exclusively to identify content from your environment and give you relevant results.

While some loyal fans of the brand are in awe of the tool, calling it “insane” and “phenomenal,” Android users are out and about, discussing Apple’s impressive ability to market a feature as entirely new.


How Will the iPhone’s AI Visual Feature Work?

Apple’s visual search technology is only a button away for iPhone users, or to be more specific, for iPhone 16 users. The iPhone AI visual feature can be activated by the new Camera Control button available on all four models of the new iPhone 16 series. With the handy button, users won’t have to hunt for the feature but will be able to pull it up quickly to learn what they need to know about their environment.

The feature will support third-party integration, so if you want ChatGPT’s help with homework, you will be able to reroute the information you capture to other apps. This makes it easier to bring in other AI tools that do more than look up immediate information, allowing Visual Intelligence to be more comprehensive by default.

Apple’s AI features have so far been reserved for the iPhone 15 Pro and all iPhone 16 series devices. With access to AI on the iPhone already limited, the Apple Visual Intelligence search feature could be further restricted to iPhone 16 users. Apple hasn’t confirmed whether you strictly need the Camera Control button to use the service or whether you can also find the option in a menu. In the former case, access to the Google Lens alternative would be restricted further still. This might be a strategy to push people to upgrade to the latest model.

Either way, the Apple Visual Intelligence AI tool should be out later this year, and users will then be able to get hands-on experience to determine how well the feature actually works.