I Tried the iPhone 16’s New Visual Intelligence, and It Feels Like the Future

By admin · October 25, 2024 · 7 min read


When you pass a Japanese tea shop in New York’s Bowery Market, you can just point your iPhone at the storefront and press and hold the button on the side of the phone to see its business hours and customer photos, along with options to call the shop or place an order.

Apple’s new Visual Intelligence tool, available on the iPhone 16 lineup, is meant to cut out the intermediate steps of unlocking your phone, opening Google or ChatGPT, and typing a query or uploading a photo to get an answer. An early version of the feature is included in Apple’s iOS 18.2 developer beta, which launched for program participants on Wednesday.

The version I tried was an early preview aimed at developers rather than consumers, but it gave me a good idea of how Visual Intelligence works and what it adds to the iPhone experience. In my brief time with this very early version, it proved great for quickly pulling up information about points of interest on the fly. Useful as it can be, I also think it will take consumers some time to embrace the feature once it’s released, because it represents a new way of thinking about how to interact with a phone.

Still, this hints at a future where you won’t need to open as many apps to do things on your mobile device, and that’s promising.

But I’d like to give it more time and talk more about it once the final version is released.

Read more: “The Cambrian Explosion: Fundamentally Reimagining the Smartphone with AI” coming soon

How visual intelligence works

Visual Intelligence relies on the new Camera Control button on the iPhone 16, 16 Plus, 16 Pro and 16 Pro Max. When you hold down the button, you’ll see a prompt explaining what Visual Intelligence is and informing you that images won’t be saved to your iPhone or shared with Apple.


When the Visual Intelligence interface is open, you take a photo by simply tapping the camera shutter button. From there, you can tap an on-screen button to ask ChatGPT about the image, or tap the search button to run a Google search. You can choose to use ChatGPT with or without an account; if you don’t sign in, your requests remain anonymous and won’t be used to train ChatGPT’s models.
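
To make that photo-plus-question flow concrete, here is a minimal sketch of how a third-party app could do something similar by sending an image and a prompt to OpenAI’s public Chat Completions API. Apple’s own ChatGPT integration is private, so the endpoint, model name and payload below are assumptions for illustration only.

```swift
import Foundation

// Hypothetical sketch of a "take a photo, ask ChatGPT" flow.
// The endpoint and payload follow OpenAI's public Chat Completions API with
// image input; Apple's actual integration is not public, so treat this
// purely as an illustration.
func askChatGPT(aboutImage imageData: Data,
                question: String,
                apiKey: String) async throws -> String {
    // The user message carries both the question and the photo (base64-encoded).
    let content: [[String: Any]] = [
        ["type": "text", "text": question],
        ["type": "image_url",
         "image_url": ["url": "data:image/jpeg;base64," + imageData.base64EncodedString()]]
    ]
    let payload: [String: Any] = [
        "model": "gpt-4o-mini",   // assumed vision-capable model
        "messages": [["role": "user", "content": content] as [String: Any]]
    ]

    var request = URLRequest(url: URL(string: "https://api.openai.com/v1/chat/completions")!)
    request.httpMethod = "POST"
    request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONSerialization.data(withJSONObject: payload)

    let (data, _) = try await URLSession.shared.data(for: request)

    // Dig the assistant's reply text out of the first choice.
    let json = try JSONSerialization.jsonObject(with: data) as? [String: Any]
    let choices = json?["choices"] as? [[String: Any]]
    let message = choices?.first?["message"] as? [String: Any]
    return (message?["content"] as? String) ?? ""
}
```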

[Screenshot: Apple’s Visual Intelligence feature on an iPhone identifying a Game Boy Color]

I took pictures of retro game consoles and asked when they were released. Visual Intelligence, via ChatGPT, got the answer right.

Lisa Eadicicco/CNET

The current version of Visual Intelligence also gives you the option to report a concern by pressing an icon that looks like three dots. If you want to delete the image and take another one instead, an × icon sits where the shutter button normally appears on screen.

On the iPhone, in addition to using Google or ChatGPT, you’ll also see certain options based on what you’re pointing the camera at, such as business hours when you aim it at a store or restaurant.
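
As a rough illustration of that place-lookup step, the sketch below uses Apple’s public MapKit search API to pull up the kind of details Visual Intelligence surfaces for a recognized storefront (name, phone number, website). How Visual Intelligence actually recognizes a business isn’t public, and MapKit doesn’t expose business hours, so this is only an approximation.

```swift
import MapKit

// Illustrative only: once a storefront has been recognized (say, "Kettl"
// near the user's location), MKLocalSearch can return some of the same
// details Visual Intelligence shows. Apple's real pipeline is private,
// and MapKit does not expose business hours.
func lookUpStore(named name: String,
                 near coordinate: CLLocationCoordinate2D) async throws {
    let request = MKLocalSearch.Request()
    request.naturalLanguageQuery = name
    request.region = MKCoordinateRegion(center: coordinate,
                                        latitudinalMeters: 500,
                                        longitudinalMeters: 500)

    let response = try await MKLocalSearch(request: request).start()
    guard let item = response.mapItems.first else { return }

    print("Name:    \(item.name ?? "unknown")")
    print("Phone:   \(item.phoneNumber ?? "n/a")")          // the "call the shop" option
    print("Website: \(item.url?.absoluteString ?? "n/a")")
}
```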

What does it feel like to use it?

In the short time I’ve spent with Visual Intelligence, I’ve used it to find out about restaurants and stores, and to ask questions about video games.

It’s a quick and convenient way to access ChatGPT and Google, but what’s most interesting to me is how it identifies restaurants and stores. So far, this has worked best when pointing the camera at a storefront rather than at a sign or banner.

For example, when I scanned the exterior of Kettl, the Japanese tea shop mentioned earlier, Visual Intelligence automatically pulled up useful information, such as photos of its various drinks. I had a similar experience when I took a photo of a vintage video game store near my office: after I pressed the shutter button, my iPhone displayed the store’s name along with a photo, a link to its website and an option to call the store.


The shop’s menu didn’t include images of the drinks, but thanks to Visual Intelligence, my phone displayed them anyway.

Lisa Eadicicco/CNET

Once inside the store, I used Visual Intelligence to ask ChatGPT for game recommendations based on the titles on the shelves and to learn more about the consoles and games in stock. The answers were fairly accurate, but it’s always worth remembering that chatbots like ChatGPT can get things wrong.

After taking a photo of the games on a shelf, I asked ChatGPT for titles similar to the Persona Dancing series, and it suggested others with a similar emphasis on music and story. That seems like a smart answer, since the Persona Dancing games are rhythm-based spin-offs of Persona, a popular Japanese role-playing series. And finding out that the Game Boy Color was released in 1998 took nothing more than a quick photo and a question. (For what it’s worth, I got similar results when I asked the same question in the ChatGPT app.)

[Screenshot: Apple Intelligence on an iPhone showing ChatGPT results]

This answer from ChatGPT and Visual Intelligence about games I might like was impressively on point.

Lisa Eadicicco/CNET

I’ve enjoyed experimenting with Visual Intelligence so far, and I can see it being even more useful while traveling. Being able to point my iPhone at a landmark, store or restaurant and learn more about it would have come in handy during my trips to France and Scotland earlier this year. In a city I already know well, there’s less need to instantly pull up details about nearby places.

Read more: What I learned from replacing my Apple Watch with Samsung’s Galaxy Ring

Visual intelligence and the future of mobile phones

It’s hard not to compare Visual Intelligence to Google Lens. Google Lens lets you use your phone’s camera to learn about the world around you instead of typing search terms. In its current form (also an early preview for developers), Visual Intelligence almost feels like a dedicated Google Lens/ChatGPT button.

Considering Google Lens has been around for years, it might not feel new or different. But this kind of functionality is so important that the fact that modern iPhones have a dedicated button for it speaks volumes. This shows that Apple thinks there may be a better way to search and get things done on your phone.

Apple isn’t alone in this belief. Google, OpenAI, Qualcomm and startups like Rabbit all believe AI can unlock new ways to use the cameras on mobile devices, turning them into a discovery tool rather than just a way to take photos. At this week’s annual Snapdragon Summit, Qualcomm showed off a virtual assistant concept that uses the camera to do things like split a restaurant bill three ways based on a photo of the receipt.

The trick is getting the public to adopt it. I’d wager that muscle memory will keep many people from ditching the old method of tapping and swiping in favor of snapping a photo, even if the new way is faster and more efficient.

Developing new habits takes time. However, Visual Intelligence is still in early preview and more features will be added in the future.
