Said to be a direct competitor to the popular Ray-Ban Meta smart glasses, Apple’s frames are expected to feature built-in cameras designed to work with its upcoming Visual Intelligence technology, set to debut on the iPhone 16 range in the near future.
Essentially Apple’s take on Google Lens, the Visual Intelligence tool will be used to scan objects and places, allowing users to instantly access information about their surrounding environment. For instance, an iPhone 16 owner could use their handset’s camera to scan a restaurant to access menu information and user reviews. Gurman, a respected Apple insider, said that the Cupertino company is looking to “salvage the billions of dollars spent on the Vision Pro’s visual intelligence technology, which can scan the environment around a user and supply useful data.”
And while it’s hard to say how AirPods with cameras could benefit the average user, Gurman has previously stated that the rumored product “would give consumers many of the benefits of smart glasses without needing lenses and frames.” Of course, if Gurman’s report is to be believed, we’re a few years out from seeing these products go on sale, so we may have to settle for the Apple Intelligence features expected to arrive with iOS 18.2 in December. In the meantime, you can access similar features, such as the ability to take photos, record video stories, and access an AI voice assistant, via the Ray-Ban Meta smart glasses.
- Apple’s Vision Pro Team is Reportedly Focused on Building a Cheaper, Lighter Headset Due to Low Sales
- Blackmirror Launches ‘Smile Club’, A Web3 Venture into Digital Ownership & Social Credit Scores
- Worldcoin Rebrands as World, Launches New Eyeball Scanning Orb
- DePIN Platform Peaq Raises $15M Ahead of Launch
- Mercedes-Benz Unveils In-Car Virtual Assistant with NFT and Generative AI Capabilities
- Leak Confirms that Sony will Debut XR Headset this Year