Apple is selling the iPhone 16 on a number of Apple Intelligence features that are not yet available, and Visual Intelligence seemed the most promising of them. The pitch was that you could point the iPhone 16 at anything and it would tell you all about it. As it turns out, Visual Intelligence is just an API call to Google or ChatGPT. In short, nothing new. Bloomberg's Apple insider Mark Gurman says it probably took Apple a week at most to implement the feature. It likely took longer than that, but it probably wasn't difficult, even allowing for the unexpected complications software development tends to present.
Apple touts Visual Intelligence as an amazing next step in the iPhone's usefulness and practicality, but it's just a front end for services you already have access to. Google and ChatGPT do the heavy lifting here, at least for now. That could change as the improved Siri and other AI features roll out next year.
Apple Intelligence promised a lot, but we're still waiting. | Video credit: Apple
If one of the iPhone 16's biggest selling points is something anyone can do on any platform, the rest of the features still awaiting release look even more disappointing. Especially over the past year, Apple has rolled out one underwhelming software update after another.
The average consumer can now look forward to the ChatGPT integration arriving with iOS 18.2. That will certainly make Apple Intelligence a little more appealing and make in-store demos easier for Apple employees, but it hardly delivers on what Apple promised. It remains to be seen whether the new Siri and on-screen awareness will live up to the hype, or turn out to be just an old tool in a new package.
Apple, please try harder, or admit that the iPhone 17 isn't as ambitious a project as you claim.