Apple Watch to Feature Cameras for AI Visual Intelligence

Key Takeaways

1. Apple is developing new Apple Watch models with built-in cameras, expected to launch by 2027.
2. The cameras will enhance Visual Intelligence, providing contextual data from the user’s environment.
3. Regular Apple Watch models may integrate a camera into the display, while the Apple Watch Ultra is likely to get a visible camera module next to the Digital Crown.
4. The cameras are not intended for selfies or video calls, focusing instead on delivering AI-generated insights.
5. Apple is also working on AirPods with infrared cameras to improve spatial audio and enable gesture-based controls.


Apple’s plans for wearable AI are getting a serious hardware upgrade. Bloomberg’s Mark Gurman reports that the company is working on new Apple Watch models with built-in cameras, signaling a major shift in how the device interacts with its surroundings. These watches, expected to launch by 2027, would use the cameras to analyze the wearer’s environment and deliver contextual data directly from the wrist.

Cameras on Apple Watches — Not for Selfies

For the regular Apple Watch, Apple appears to be exploring a camera integrated into the display, though it isn’t yet clear whether that means an under-display sensor or a visible cutout. On the more rugged Apple Watch Ultra, the approach looks more straightforward: a camera module placed next to the Digital Crown, letting users point their wrist at something and scan it quickly.

These cameras won’t be designed for taking selfies or making FaceTime calls—those concepts are still too ambitious for such a small screen. Instead, the emphasis is on enhancing Visual Intelligence, a feature introduced with the iPhone 16 that enables users to aim their device at an object or text to receive immediate AI-generated insights. Imagine something similar to Google Lens, but more seamlessly integrated into Apple’s ecosystem, and soon available on your wrist.
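To make the point-and-scan idea concrete, here is a minimal sketch of how a developer could prototype a similar flow today with Apple’s existing Vision framework on iOS. The function name and the choice of on-device text recognition are assumptions for illustration; nothing is publicly known about the APIs a camera-equipped Watch would actually expose.

```swift
import UIKit
import Vision

// Minimal sketch (not Apple's actual Watch pipeline): run on-device text
// recognition on a captured image, roughly mirroring the point-and-scan
// flow Visual Intelligence offers on iPhone.
func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }

    // VNRecognizeTextRequest performs OCR entirely on device.
    let request = VNRecognizeTextRequest { request, error in
        guard error == nil,
              let observations = request.results as? [VNRecognizedTextObservation] else {
            completion([])
            return
        }
        // Keep the best candidate string for each detected text region.
        completion(observations.compactMap { $0.topCandidates(1).first?.string })
    }
    request.recognitionLevel = .accurate

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```

On current hardware a step like this runs on the phone; how much of such a pipeline could run on the Watch itself versus being offloaded to a paired iPhone is still unknown.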

Apple Is Expanding Its Vision

But that’s not all. The company is also reportedly developing AirPods with built-in infrared cameras. Specifics are limited, but the aim appears to be improved spatial audio and possibly gesture-based controls that could interact with augmented reality environments. Picture turning your head or waving a hand in the air and having your devices respond naturally.
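Part of this already exists in sensor form: current AirPods expose head orientation through CoreMotion, which spatial audio relies on. Below is a minimal sketch of reading that data with the public CMHeadphoneMotionManager API; the class and flow shown are real today, but the rumored infrared-camera gestures are not part of any shipping SDK.

```swift
import CoreMotion

// Minimal sketch: read head orientation from supported AirPods using the
// public CMHeadphoneMotionManager API (iOS 14+). This only shows today's
// motion-sensor route, not the rumored camera-based gestures.
final class HeadTracker {
    private let manager = CMHeadphoneMotionManager()

    func start() {
        guard manager.isDeviceMotionAvailable else {
            print("Headphone motion data is not available on this device.")
            return
        }
        // Requires the NSMotionUsageDescription key in Info.plist.
        manager.startDeviceMotionUpdates(to: .main) { motion, error in
            guard let motion = motion, error == nil else { return }
            // Attitude is reported in radians; yaw corresponds roughly to
            // turning the head left or right.
            let yawDegrees = motion.attitude.yaw * 180 / .pi
            print("Head yaw: \(String(format: "%.1f", yawDegrees))°")
        }
    }

    func stop() {
        manager.stopDeviceMotionUpdates()
    }
}
```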

All of this hinges on Apple weaning Visual Intelligence off external AI models such as ChatGPT and onto its own in-house systems. That’s a significant challenge, but if Apple pulls it off, the wearable tech landscape could look dramatically different by the end of the decade.

