If you’ve read my previous thoughts on the iPhone on TechRadar and our sister site Tom’s Guide, you know that I have pretty strong opinions about Apple’s smartphones.
Since moving from Android to iPhone at the end of 2021, I've never been tempted back to Google's platform, even after trying some of the best Android phones. The ease of use of iOS won me over. I love the titanium construction, find the Ceramic Shield glass a minor game changer, and am a big fan of the Action Button. And the iPhone's camera rarely disappoints me.
But for once, I'm on the fence.
What got me thinking is the Camera Control "button". On one hand, it's a great new feature that makes clever use of touch. On the other, it feels redundant and underpowered.
I've been experimenting with the iPhone 16 Pro Max over the past few weeks, and I've tried to use Camera Control as much as possible when taking photos. As a 37-year-old millennial, I still like shooting in landscape orientation on my phone, so having a physical button right where my finger naturally rests is a good way to capture a shot without tapping the screen or shifting my grip and disrupting the framing. As for the Action Button, I've mapped it to the torch instead, which is surprisingly useful.
I also like that I can slide along Camera Control to sweep through the zoom range without tapping the little on-screen icon. The exposure control is neat in its way, but toggling between the functions Camera Control offers still isn't intuitive, and the careful framing of a scene is often lost in the process.
Camera Control is interesting, but…
Did anyone actually ask for this? It feels like a feature created so Apple executives would have something new to talk about at the September event. It's a nice-to-have, but it doesn't revolutionize phone photography.
Not my tempo
Granted, you might get used to it over time. The bigger problem is the lack of AI tools for Camera Control at launch. Apple has been actively touting the button's AI capabilities, which will smartly identify what the camera is pointing at and surface all sorts of information. None of that has arrived yet; the rollout will come after launch, once Apple Intelligence fully lands. There is a beta option, but I don't feel like running it on my primary phone.
I still don't get it. Sure, other phone makers also tout AI features that arrive post-launch, and some are limited to certain regions at first, but at least part of the promised AI suite usually ships with the phone. The iPhone 16 series launched without any Apple Intelligence features at all.
That's not what I expected from Apple, a company famous for not adopting new technology until it's polished and ready for prime time. Launching a flagship without its next-generation smarts is puzzling to me. And it's the main reason I struggle with Camera Control: if Google Lens-style features had been baked into the hardware at launch, there would be far more reason to reach for the button.
Of course, now that Apple has adopted a camera button like this, other phone makers will no doubt follow suit. I just hope they don't skimp on features when their phones launch.
As for Camera Control in its current form, I'll keep using it with an open mind. I hope it becomes genuinely useful once a decent set of AI features arrives.