Apple is reportedly planning to incorporate miniature cameras into future iterations of AirPods and the Apple Watch, with a potential launch in 2027. According to sources including Bloomberg’s Mark Gurman and analyst Ming-Chi Kuo, these upcoming devices will be key to further integrating Apple Intelligence into the company’s hardware ecosystem.
The cameras are expected to enable a range of AI-driven functionalities, such as scanning and describing objects using a Visual Intelligence system, similar to what is found on newer iPhones.
While these wearables are unlikely to offer traditional photography or video calling features such as FaceTime, the cameras’ primary purpose will be to gather visual data for on-device AI, enabling smarter and more contextually aware user experiences.
The upcoming generation of AirPods may include infrared cameras that could enable advanced features such as enhanced spatial audio when paired with Vision Pro, as well as in-air gesture controls that let users interact with their devices through hand movements.
Analyst Ming-Chi Kuo previously reported that Apple plans to begin mass production of these infrared-camera-equipped AirPods by 2026, signaling a move toward more sophisticated and immersive wearable technology.
Apple is also said to be working on its own smart glasses, featuring integrated cameras, microphones, and AI-powered functions akin to those of Meta’s Ray-Ban smart glasses. The glasses are expected to use a new energy-efficient chip, based on the design used in the Apple Watch, to preserve battery life while supporting real-time AI features.
While it remains uncertain which specific models will launch with camera capabilities, the inclusion of visual intelligence in wearables represents a significant step forward in Apple’s AI-driven product development.