Apple is pushing forward with innovative AI-driven wearable technology, including plans to integrate cameras into future Apple Watch models. According to Bloomberg’s Mark Gurman, the company is testing smartwatches equipped with visual intelligence capabilities, allowing users to interact with their surroundings through AI-powered features.
This technology, already available on the iPhone 16 series running iOS 18.2, offers functionality similar to Google Lens. The implementation could vary between models: the standard Apple Watch may embed the camera within the display, while the Apple Watch Ultra might position the sensor near the crown and power button.
Expanding Wearable AI Beyond Smartwatches
Apple’s ambitions extend beyond smartwatches. Recent reports indicate the company is also developing AirPods with built-in cameras, further integrating visual intelligence into its ecosystem. These advancements suggest a broader strategy to enhance AI-driven interactions across Apple’s product lineup.
While the camera-equipped Apple Watch is still in development, Gurman estimates a possible launch by 2027. If the effort succeeds, it could redefine how users engage with wearable technology, blending AI and real-world interaction in new ways.
Apple has not officially confirmed these plans, but the exploration signals the company’s commitment to advancing AI capabilities in its devices.