I’ve been waiting to try Visual Intelligence ever since Apple first announced the iPhone 16 in September. As it turns out, the first real use I’ve found for my iPhone’s new Camera Control button has nothing to do with taking photos.
I’ve been using the iPhone 16 Pro Max for a month, but the Camera Control button never clicked with me. I take a fair number of photos with my iPhone, yet every time I try to use the dedicated button I find it fiddly and confusing, and I keep falling back on my trusty touchscreen and Lock Screen shortcuts.
Before going any further, a word of caution: iOS 18.2 is a developer beta. Please don’t install it on your primary device, as it’s still in development and not ready for public release. Everything in this article concerns the functionality itself, not performance. If you want to try Visual Intelligence for yourself, I recommend waiting for the official release of iOS 18.2 later this year.
Anyway, back to Visual Intelligence. I currently have iOS 18.2 installed on my iPhone 16 Pro Max, and so far I’m a big fan of where Visual Intelligence is headed.
What is Visual Intelligence, I hear you ask? It’s an Apple Intelligence feature exclusive to the iPhone 16 lineup that takes full advantage of Camera Control. Long-press the Camera Control button to launch it and snap a photo of whatever you’re looking at. From there, you can ask ChatGPT for information, run a Google search, or highlight text in the photo. Think of Visual Intelligence as Apple’s take on Google Lens, with its own hardware button you can reach on the fly.
My first impressions of Visual Intelligence
Visual Intelligence can be launched from anywhere (even from the Lock Screen), which makes it very handy whenever you want to run a quick search. My first test was to take a picture of the Game Boy Camera sitting on my desk. As I mentioned above, Visual Intelligence gives you several options, so I first used Google search to identify the product. Then I asked ChatGPT for some information, and it told me all about the history of the Game Boy Camera. From there you can ask follow-up questions, so I asked, “When was the Game Boy Camera released in Europe?” ChatGPT answered correctly.
Although still in development, Visual Intelligence worked well on recognizable products like the Game Boy Camera. I’m not sure how often I’ll use Visual Intelligence to look up items, but considering it’s just a long press away, it might well become my go-to way to search for something on the web.
Another great use for Visual Intelligence is pulling up information about shops, cafes, bars, and restaurants while you’re out and about. I tested it at a local coffee shop, and it didn’t work the way Apple showed it off in its demo, but I suspect that has more to do with the early beta I’m testing than with the feature itself.
In its demo, Apple also showed Visual Intelligence identifying a dog’s breed. I tried this with my French Bulldog, and although a Google search surfaced similar-looking dogs, I couldn’t get a definitive answer.
That sums up Visual Intelligence in its current form: it has great potential. I like how it gives real purpose to the Camera Control button, and when it works, it’s great. However, it’s at a very early stage of development and, as expected, there are plenty of kinks to iron out.
One thing’s for sure, though: Visual Intelligence makes perfect sense to me now, and I finally understand why Apple added Camera Control to the new iPhones. As long as it works smoothly, this is the kind of Apple Intelligence feature people will turn to when they need quick answers, and the ChatGPT and Google integrations make it genuinely versatile.
I love testing new features in iOS. Every year, my iPhone spends most of its life on a beta, but the iOS 18.2 developer beta feels like the most exciting one yet. I’ve only been using the software for a few hours, and I still haven’t gained access to Genmoji or Image Playground (I’m on a waiting list), but I can already say with confidence that iOS 18.2 looks like the iOS 18, and the Apple Intelligence, we’ve all been waiting for.
I’ve only had a glimpse of what Visual Intelligence has to offer, and I’m really looking forward to seeing the finished product later this year. Exclusive to the latest and best iPhones, it could be a genuine reason to buy an iPhone 16. Who would have thought the Camera Control button would be why?