Apple is accelerating its efforts in artificial intelligence (AI) hardware with plans to release smart glasses by the end of 2026, while simultaneously halting development of a smartwatch equipped with a built-in camera designed to analyze the surrounding environment.
Sources familiar with the company’s strategy reveal that Apple engineers are intensifying work on their own AI-enabled smart glasses, aiming to rival Meta’s popular Ray-Ban smart eyewear series. Prototype production with overseas suppliers is expected to begin by the end of this year, marking a significant step toward mass production.
Apple’s smart glasses will feature integrated cameras, microphones, and speakers, allowing users to interact with their environment and issue commands via the Siri voice assistant. Capabilities will include phone calls, music playback, real-time translation, and turn-by-turn navigation. The device’s concept closely mirrors Meta’s current offerings and upcoming products powered by Alphabet’s Android XR operating system.
Internally dubbed the N50 project and part of a broader initiative known as N401, the glasses are still under active development, with the company’s final plans subject to change. Apple envisions these glasses evolving into augmented reality (AR) devices that overlay digital information onto the wearer’s real-world view—a goal expected to require several more years of innovation.
Earlier this month, Bloomberg reported that Apple is designing dedicated chips for the glasses, with mass production potentially starting as early as next year. Insiders note that while the product will be comparable to Meta’s smart eyewear, Apple aims to deliver superior build quality. Meta, meanwhile, is preparing a higher-end glasses model with a built-in display for notifications and photos, with true AR glasses slated for 2027.
Apple has also explored AI features in other wearables. The company had been developing Apple Watch and Apple Watch Ultra models equipped with cameras to enhance environmental awareness, targeted for a 2027 release. However, this project was recently canceled, although work continues on AirPods featuring built-in cameras.
As competition intensifies, Apple faces mounting pressure to innovate across its device portfolio. The company has long integrated AI features into its iPhones, iPads, and Macs but risks falling behind in emerging categories. Apple’s AI platform, launched last year, currently lags competitors but is set to improve, including plans to open its large language models (LLMs) to third-party developers, fostering AI-driven apps on the App Store.
In addition to smart glasses, Apple intends to enter the foldable smartphone market by late 2026 and plans further new designs in 2027, responding to moves by other industry players.
Still, some Apple employees involved in smart glasses development worry that the company’s relative weakness in AI could hold back the new product. Meta’s Ray-Ban glasses leverage Meta’s Llama AI, and Google’s upcoming Android-powered glasses use its Gemini AI platform. Currently, Apple relies on Google Lens and OpenAI technologies for visual intelligence on iPhones but aims to replace these with proprietary solutions in future hardware.
Apple’s smart glasses are primarily developed by the Vision Products Group, the same team behind the Vision Pro headset. The group is also working on new Vision Pro versions, including a lighter, more affordable model and a Mac-connected version designed for low-latency applications.
The company’s wearable roadmap continues to shift: plans for AR glasses requiring Mac connectivity were scrapped earlier this year, illustrating Apple’s cautious approach amid rapidly evolving AI and AR landscapes.
As Apple navigates these challenges, its 2026 smart glasses launch represents a pivotal move in the race to define the future of AI-powered consumer technology.