Apple is reportedly developing a camera-equipped Apple Watch to enable advanced AI capabilities, including Visual Intelligence, according to Bloomberg’s Mark Gurman. In his latest Power On newsletter, Gurman says the company aims to introduce these enhancements within the next two years.
Camera Placement and Functionality
Sources indicate that the standard Apple Watch Series will feature an under-display camera, while the rugged Apple Watch Ultra will house its camera on the side, adjacent to the digital crown and button. These cameras would allow the device to analyze its surroundings using AI and deliver context-aware information to users. A similar strategy is reportedly in play for rumored camera-equipped AirPods.
Visual Intelligence Expansion
The Visual Intelligence system, first introduced on the iPhone 16, leverages AI to perform tasks such as extracting event details from flyers or retrieving restaurant information. Initially powered by third-party AI models, Apple reportedly plans to transition to its own proprietary AI technology by 2027—the expected launch window for the revamped Apple Watch and AirPods.
Leadership and Future Wearables
The development of these AI-driven wearables falls under the leadership of Mike Rockwell, who recently took charge of accelerating Apple’s delayed Siri large language model (LLM) upgrade. Rockwell, who previously oversaw the Vision Pro headset, is also expected to continue work on visionOS. That software may eventually support another AI-focused wearable: augmented reality glasses similar to Meta’s Orion concept, though their release remains years away.
Apple’s push into AI-enhanced wearables signals a broader strategy to integrate intelligent features across its product ecosystem, blending hardware innovation with advanced machine learning.