Apple is reportedly developing new chips for smart glasses that would compete with Meta's smart glasses, according to a Bloomberg report from Mark Gurman.
The smart glasses chip is reportedly based on the low-power processor used in the Apple Watch, optimized for power efficiency and adapted to control multiple cameras.
According to the report, Apple aims to start chip production by late 2026 or 2027, which would put the devices on track for a market launch within the next two years. Taiwan Semiconductor Manufacturing Co., Apple's longtime chip partner, is expected to handle production.
"Apple is currently exploring non-AR (smart) glasses that use cameras to scan the surroundings and rely on AI to assist users," Gurman writes. "While the devices would be similar to Meta's products, Apple is still deciding on the exact approach it wants to take. The iPhone maker will need to significantly improve its own AI offerings before it can ship a compelling AI-centric device."
As for Apple's broader augmented reality ambitions, Bloomberg reported in April that CEO Tim Cook "cares about nothing else" than beating Meta to market with AR glasses.
For context, smart glasses like Meta's Ray-Ban glasses can play audio, take photos, make calls, and provide access to voice assistants. The latest version of the device, released in 2023, has been a major success, and Meta is reportedly preparing a next-generation model that adds a single heads-up display.
Meanwhile, the all-day AR glasses that Apple, Google, Meta, and others hope to build are several steps beyond that. AR glasses overlay digital content on the real world, blending virtual objects and information with the physical environment through transparent displays, which requires far more advanced sensors, displays, optics, processors, batteries, and thermal management.