As is customary around this time of year, Apple announced its new iPhone series last week. The headline feature meant to make you want to buy these new devices was AI, or Apple Intelligence as the company brands it. But the reaction from across the consumer technology world has been muted.
The lack of consumer enthusiasm was clear, and Apple’s market value instantly shed more than US$100 billion. Even Wired’s Gadget Lab podcast, whose hosts are aficionados of new technology, couldn’t find any new features compelling enough to make you want to upgrade to the iPhone 16.
The only addition that seemed to generate real excitement was a new camera shutter button on the side of the phone, not the AI features. When a button is a better selling point than the most hyped technology of the past few years, something is clearly wrong.
That’s because AI has passed what the tech blog The Media Copilot calls the “wonder stage.” Two years ago, we were amazed that ChatGPT, DALL-E, and other generative AI systems could create coherent sentences and realistic images from just a few words in a text prompt. Now AI needs to show that it can actually be productive. Since their introduction, the models driving these experiences have become much more powerful, and far more expensive to develop and run.
Nevertheless, Google, Nvidia, Microsoft, and OpenAI recently gathered at the White House to discuss AI infrastructure, suggesting that these companies are still doubling down on the technology.
According to Forbes, the industry is US$500bn (£375bn) short of recouping its huge investments in AI hardware and software, with the US$100bn of AI revenue expected in 2024 falling well short of that figure. But Apple still needs to aggressively push AI capabilities into its products, for the same reason Google, Samsung, and Microsoft are doing it: to give consumers a reason to buy a new device.
Is AI a hard sell?
Before AI, the industry was trying to generate hype around virtual reality and the metaverse, an effort that probably peaked with the introduction of the Apple Vision Pro headset in 2023 (incidentally, that product was barely mentioned in last week’s announcement).
After the metaverse failed to gain traction, technology companies needed something else to drive sales, and AI became the new shiny thing. But it remains to be seen whether consumers will embrace AI-based features on their phones, such as photo editing and writing assistants. This is not to say that current AI is useless. The technology is already used in multibillion-dollar industrial applications, in everything from online advertising to healthcare and energy optimization.
Generative AI has also become a useful tool for professionals in many fields. Research shows that 97% of software developers have used AI tools to support their work. Many journalists, visual artists, musicians, and filmmakers are adopting AI tools to create content faster and more efficiently.
But most of us aren’t actually going to pay for a service that draws funny cat cartoons or summarizes text, especially since AI-powered attempts at search are notoriously error-prone. Apple’s approach to introducing artificial intelligence seems to be mostly a hodgepodge of existing features, many of which are already built into popular third-party apps.
Apple’s AI can help you create custom emojis, transcribe phone calls, edit photos, and compose emails. These are useful but hardly groundbreaking. There is also a feature called Reduce Interruptions, which is supposed to cut down on distractions by only letting important notifications through, but it remains to be seen how well that works in practice.
One of the more forward-looking features is called Visual Intelligence. It lets you point the camera at things around you and retrieve information without having to explicitly search for it. For example, if you point it at a restaurant sign, your phone might show you the menu and reviews, or help you book a table.
It’s very reminiscent of Google Lens on Pixel phones (and of ChatGPT’s multimodal features), but more real-time and interactive, and indicative of future uses of AI situated in real-world environments.
With enhancements like these, features such as Visual Intelligence and Reduce Interruptions could evolve into so-called “context-aware computing.” This has been envisioned and demonstrated in research projects since the 1990s, but it has never been robust enough to become an actual product category.
The catch is that Apple Intelligence isn’t actually available for anyone to try yet: the new iPhones ship without it, and the features are due to arrive in later software updates. Perhaps they will turn out to be more valuable than the limited information so far suggests. But Apple was once known for only releasing products when they were truly ready, with a crystal-clear use case and a perfectly honed user experience.
This is why the iPod and iPhone were so much more attractive than all the MP3 players and smartphones released before them. No one knows whether Apple’s approach to AI will recoup some of its lost stock market value, let alone the hundreds of billions of dollars that it and the rest of the tech industry have invested. AI still has great potential, but it may be time to slow down and think about where it might actually be most useful.
Lars Erik Holmquist, Professor of Design Innovation, Nottingham Trent University
This article is republished from The Conversation under a Creative Commons license. Read the original article.