Said to be a direct competitor to the popular Ray-Ban Meta smart glasses, Apple’s frames are expected to feature built-in cameras designed to work with its upcoming Visual Intelligence technology, set to debut on the iPhone 16 range in the near future.
Essentially Apple’s take on Google Lens, the Visual Intelligence tool will be used to scan objects and places, allowing users to instantly access information about their surrounding environment. For instance, an iPhone 16 owner could use their handset’s camera to scan a restaurant to access menu information and user reviews. Gurman, a respected Apple insider, said that the Cupertino company is looking to “salvage the billions of dollars spent on the Vision Pro’s visual intelligence technology, which can scan the environment around a user and supply useful data.”
And while it’s hard to say how AirPods with cameras could benefit the average user, Gurman has previously stated that the rumored product “would give consumers many of the benefits of smart glasses without needing lenses and frames.” Of course, if Gurman’s report is to be believed, we’re a few years out from seeing these products go on sale, so we may have to settle for the Apple Intelligence features expected to arrive with iOS 18.2 in December. In the meantime, you can access similar features, such as the ability to take photos, record video, and use an AI voice assistant, via the Ray-Ban Meta smart glasses.