Said to be a direct competitor to the popular Ray-Ban Meta smart glasses, Apple’s frames are expected to feature built-in cameras designed to work with its upcoming Visual Intelligence technology, set to debut on the iPhone 16 range in the near future.
Essentially Apple’s take on Google Lens, the Visual Intelligence tool will be used to scan objects and places, allowing users to instantly access information about their surrounding environment. For instance, an iPhone 16 owner could use their handset’s camera to scan a restaurant to access menu information and user reviews. Gurman, a respected Apple insider, said that the Cupertino company is looking to “salvage the billions of dollars spent on the Vision Pro’s visual intelligence technology, which can scan the environment around a user and supply useful data.”
And while it’s hard to say how AirPods with cameras could benefit the average user, Gurman has previously stated that the rumored product “would give consumers many of the benefits of smart glasses without needing lenses and frames.” Of course, if Gurman’s report is to be believed, we’re a few years out from seeing these products go on sale, so we may have to settle for the Apple Intelligence features expected to arrive with iOS 18.2 in December. In the meantime, you can access similar features, such as the ability to take photos, record video, and use an AI voice assistant, via the Ray-Ban Meta smart glasses.