Everything You Can Do with the New ‘Camera Control’ on the iPhone 16

Apple recently held an event to unveil the iPhone 16 and 16 Pro, along with new Apple Watch and AirPods models. The company focused on new AI features, but it was the first new button added to the iPhone in years, rather than the already well-publicized Apple Intelligence, that caught the eye.


As rumored ahead of the event, all four new 2024 iPhone models feature a new capacitive gesture button called “Camera Control.” The slightly recessed button sits below the power button on the right edge. When the device is held in landscape, you can conveniently press it with your right index finger like a camera shutter; it works in portrait mode as well.

As the name suggests, Camera Control is used to take pictures. Its main functions are as follows:

  • Launch the camera app: The most basic function is opening the camera app with a single click of the button.
  • Take a photo: Once the camera app is open, click the button again to take a picture. Press firmly rather than lightly; a light press performs a different function.
  • Record video: In video mode, clicking the button starts recording. In photo mode, long-pressing starts video recording immediately.
  • Zoom in/out: Instead of tapping the on-screen controls, lightly press the button and a zoom dial pops up next to your finger. Swipe left or right across the button to zoom in or out.
  • Other camera settings: Lightly double-press the button to bring up a menu of camera settings, then swipe to adjust options such as zoom, depth of field, and exposure.

These are the main features Apple lists on the iPhone 16 product page, but Camera Control doesn’t stop there. Apple said in the keynote that the button will also power an Apple Intelligence feature called “Visual Intelligence.” Point your iPhone at an object or place and long-press Camera Control, and you’ll see AI analysis of it. For example, if it’s a dog, it will identify the breed; if it’s a restaurant, it will show the business hours. However, this feature isn’t coming right away; Apple said it will be added later this year.

Likewise, you can use Camera Control to search for what’s in front of you. For example, point the camera at a bicycle and click the button to run a Google image search on it. Searching with ChatGPT is also supported.

As with previous new hardware controls, Apple is hoping that third-party app makers will come up with creative uses for Camera Control. Over the coming months, iPhone 16 and 16 Pro users should be able to quickly access apps and app features in a variety of ways via the button. Whether it will be a hit or a flop remains to be seen, but for now, it has compelling potential.
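For developers, Apple surfaces Camera Control to third-party camera apps through AVFoundation’s capture-controls API in iOS 18. The sketch below is a minimal, unofficial illustration (not from the article) of how an app might attach the system zoom slider and a custom slider to the button; the `attachCameraControls` function name and the “Intensity” control are hypothetical, and it assumes an already-configured session and device. It only runs on an iPhone 16-class device.

```swift
import AVFoundation

// Minimal sketch (iOS 18+): wiring Camera Control into a capture session.
// Assumes `session` is a configured AVCaptureSession and `device` is the
// active AVCaptureDevice; error handling is omitted for brevity.
func attachCameraControls(to session: AVCaptureSession,
                          device: AVCaptureDevice,
                          delegate: AVCaptureSessionControlsDelegate) {
    // Devices without the Camera Control button report no support.
    guard session.supportsControls else { return }

    let queue = DispatchQueue(label: "camera-controls")
    session.setControlsDelegate(delegate, queue: queue)

    // System-provided slider that drives the device's zoom factor directly.
    let zoom = AVCaptureSystemZoomSlider(device: device)
    if session.canAddControl(zoom) {
        session.addControl(zoom)
    }

    // A custom app-defined slider shown in the Camera Control overlay.
    let intensity = AVCaptureSlider("Intensity", symbolName: "dial.medium", in: 0...1)
    intensity.setActionQueue(queue) { value in
        // React to the user swiping the control (e.g., update a filter).
        print("Intensity changed to \(value)")
    }
    if session.canAddControl(intensity) {
        session.addControl(intensity)
    }
}
```

The delegate callbacks let the app know when the user begins interacting with the button, so it can, for instance, hide its own on-screen controls while the overlay is visible.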
editor@itworld.co.kr

Source: www.itworld.co.kr