Said to be a direct competitor to the popular Ray-Ban Meta smart glasses, Apple’s frames are expected to feature in-built cameras designed to work with its upcoming Visual Intelligence technology, set to debut on the iPhone 16 range in the near future.
Essentially Apple’s take on Google Lens, the Visual Intelligence tool will be used to scan objects and places, allowing users to instantly access information about their surrounding environment. For instance, an iPhone 16 owner could use their handset’s camera to scan a restaurant to access menu information and user reviews. Gurman, a respected Apple insider, said that the Cupertino company is looking to “salvage the billions of dollars spent on the Vision Pro’s visual intelligence technology, which can scan the environment around a user and supply useful data.”
And while it’s hard to say how AirPods with cameras could benefit the average user, Gurman has previously stated that the rumored product “would give consumers many of the benefits of smart glasses without needing lenses and frames.” Of course, if Gurman’s report is to be believed, we’re a few years out from seeing these products go on sale, so we may have to settle for the Apple Intelligence features expected to arrive with iOS 18.2 in December. In the meantime, you can access similar features, such as the ability to take photos, record video, and use an AI voice assistant, via the Ray-Ban Meta smart glasses.