Snap, the company behind Snapchat, announced today that it is working on the next generation of its AR glasses, aka "Specs."
Snap released its fifth-generation Spectacles ('24) to developers only in the second half of 2024, and began offering them to students and teachers in January 2025 through an educational discount program.
Today at AWE 2025, Snap announced that it will release the updated glasses to consumers next year. Evan Spiegel, Snap's co-founder and CEO, teased that the new Specs will come in a smaller form factor, at a fraction of the weight, with more capability.
Beyond the 2026 launch window, there's no word yet on price or availability. I haven't even seen the device in question, though presumably it's less bulky than the current Spectacles.
Spiegel also noted that the company's library of more than 4 million Lenses, which add 3D effects, objects, characters, and AR transformations, will be compatible with the upcoming Specs.
The company isn't talking hardware specifics yet, but the version introduced in 2024 sports a 46-degree diagonal field of view delivered through waveguide displays with automatically tinting lenses, driven by dual liquid crystal on silicon (LCoS) micro-projectors that boast 37 pixels per degree.
As a standalone unit, the device features dual Qualcomm Snapdragon processors, stereo speakers for spatial audio, six microphones for voice recognition, and two high-resolution color cameras plus two infrared computer vision cameras that enable 6DoF spatial awareness and hand tracking.
There's no word on how those specs will change in the next version, but I'm hoping for more than the current model's 45 minutes of battery life.

And as the company prepares to release its first publicly available AR glasses, Snap has also announced major updates coming to Snap OS. Key additions include new integrations with OpenAI and Google Cloud's Gemini, allowing developers to build multimodal AI-powered Lenses for Specs. These enable things like real-time translation, currency conversion, recipe suggestions, interactive adventures and more.
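Snap hasn't published code for these integrations, but a "translate what I'm looking at" Lens would plausibly follow the shape below. This is a minimal sketch in TypeScript: every name in it (MultimodalModel, captureFrame, translateSign, the mock objects) is a hypothetical stand-in I've invented for illustration, not Snap's, OpenAI's, or Google's actual API.

```typescript
// Hypothetical sketch: none of these names are Snap's real API.

// A minimal interface standing in for a cloud-hosted multimodal model
// (e.g. an OpenAI or Gemini endpoint reached through Snap's integration).
interface MultimodalModel {
  generate(imageBase64: string, prompt: string): Promise<string>;
}

// Mock model so the sketch runs standalone; a real Lens would call the
// OpenAI or Gemini integration Snap describes instead.
const mockModel: MultimodalModel = {
  async generate(_image, prompt) {
    return `[model reply to: "${prompt}"]`;
  },
};

// Mock camera capture; on-device, the glasses runtime would supply this.
async function captureFrame(): Promise<string> {
  return "<base64-encoded camera frame>";
}

// The core flow: grab the current camera frame, send it to the model
// with an instruction, and surface the reply as AR text.
async function translateSign(model: MultimodalModel): Promise<string> {
  const frame = await captureFrame();
  return model.generate(frame, "Translate any visible signage into English.");
}

translateSign(mockModel).then((text) => console.log(text));
```

The interesting design question is simply that the camera frame and the language model are composed on-device, which is what makes the translation and recipe-style use cases Snap lists possible.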
Additionally, new APIs are said to expand spatial and audio capabilities, including a Depth Module API that anchors AR content in three-dimensional space, and an Automated Speech Recognition API that supports over 40 languages. The company's Snap3D API is also said to enable real-time 3D object generation within Lenses.
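To make the speech-recognition piece concrete, here is what a captioning Lens built on such an API might look like. Snap hasn't published the real signatures, so SpeechRecognizer, onTranscript, and the language option are assumptions; only the over-40-language support comes from the announcement.

```typescript
// Hypothetical sketch of a speech-recognition flow; names are assumed.

type TranscriptHandler = (text: string, isFinal: boolean) => void;

interface SpeechRecognizer {
  start(options: { language: string }): void;
  stop(): void;
  onTranscript(handler: TranscriptHandler): void;
}

// Minimal in-memory recognizer so the sketch runs standalone.
class MockRecognizer implements SpeechRecognizer {
  private handler: TranscriptHandler = () => {};
  start(options: { language: string }): void {
    console.log(`listening (${options.language})...`);
    // Pretend the microphone heard one finalized utterance.
    this.handler("bonjour tout le monde", true);
  }
  stop(): void {
    // Nothing to clean up in the mock.
  }
  onTranscript(handler: TranscriptHandler): void {
    this.handler = handler;
  }
}

// A Lens might wire transcripts into AR captions like this: pick one of
// the supported languages, then react to each finalized utterance.
const recognizer: SpeechRecognizer = new MockRecognizer();
recognizer.onTranscript((text, isFinal) => {
  if (isFinal) console.log(`caption: ${text}`);
});
recognizer.start({ language: "fr-FR" });
```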
For developers building location-based experiences, Snap says it has introduced a fleet management app, a Guided Mode that launches wearers directly into a Lens, and Guided Navigation for AR tours. Planned features include Niantic Spatial VPS integration and WebXR browser support, enabling a globally shared, AI-assisted map and broader access to WebXR content.
If Specs ship to consumers on schedule, Snap will be in the unique position of first mover. Companies including Apple, Meta and Google have yet to release AR glasses of their own, but expect the years-long race to heat up. The broad consensus is that each of these companies wants to own a key piece of AR, because many expect the device class to unseat the smartphone as the dominant computing paradigm of the future.