Apple is reportedly planning to embed miniature cameras in future versions of its AirPods and Apple Watch, with launches anticipated as early as 2027. According to Bloomberg’s Mark Gurman and analyst Ming-Chi Kuo, the move is a pivotal step in deepening the integration of Apple’s AI capabilities across its hardware ecosystem.
These embedded cameras are expected to power AI-driven features, letting devices scan and interpret objects through a Visual Intelligence system similar to the one available on the latest iPhones. Rather than serving conventional photography or video calls such as FaceTime, the cameras would primarily collect visual data to enhance on-device AI, delivering smarter, more context-aware user experiences.
The next generation of AirPods could feature infrared cameras that unlock advanced functionality, including improved spatial audio when paired with Apple’s Vision Pro headset and gesture controls that let users interact with their devices through mid-air hand motions. Ming-Chi Kuo has indicated that mass production of these infrared camera-equipped AirPods could begin by 2026.
In parallel, Apple is developing smart glasses equipped with cameras, microphones, and AI-powered features, comparable to Meta’s Ray-Ban smart glasses. The glasses are expected to use a new, energy-efficient chip derived from the Apple Watch’s architecture, extending battery life while enabling real-time AI processing.
While it remains unclear which specific device models will debut with these camera enhancements, the plan marks a significant step in Apple’s effort to embed visual intelligence across its wearable lineup.