Apple’s Monster iPhone update: How Apple Intelligence works in iOS 18.1
After a month of public beta testing in my secret lab, lo and behold, Apple Intelligence on the iPhone is alive. Well, it’s kind of alive. It’s still an incomplete experience, with a lot of features, including the flashiest ones, held back for developer testing, and it will be a while before they reach the public. You might call it a patchwork job when iOS 18.1 rolls out in the next few days.

It’s a bit of an eerie time for Apple. The company is making a bold push into generative AI with Siri, but it’s not a finished product. Apple itself says as much, and that’s part of why it’s drawing criticism. My own experience with the public beta left me with a few questions, including whether having my quirky friends’ text messages summarized in this impersonal way really improves my life. Now that the software update is available for download, let’s take a look at what to expect from Apple Intelligence. I’m Bridget Carey.

Back in June, Apple announced all the coolest features coming to Apple Intelligence, and you’ll probably hear about people testing some of those features on their iPhones, but the public still can’t get all of them right away. iOS 18.1 comes with several new features, including Apple Intelligence, but the only iPhones that can take advantage of the AI are the iPhone 15 Pro models and the new iPhone 16 lineup. When it comes to Mac computers, anything with Apple silicon gets Apple Intelligence, meaning anything with an M-series chip. Apple’s AI is also coming to iPads with M-series chips, and since the latest iPad mini has the same A17 Pro chip as the iPhone 15 Pro, it’s included too. It’s currently only available in the US and in English, but more countries will get access later this year.

When you come across something that uses Apple Intelligence, you’ll notice a rainbow glow, similar to the way Siri now lights up the entire border of your phone. I’ve been using it on this iPhone 16 and on my Frankenphone running the beta version of the operating system. The first public release covers three big areas of change in how you interact with your iPhone: how you write, your photos and how you talk to Siri.

For example, when you highlight text while typing, Apple can make suggestions to improve the tone of the message you’re writing. Now, I know how I want to convey my thoughts, so this isn’t necessary for me, but it’s handy if you want to think twice about how you phrase something sensitive. That said, leaning on these tools can sometimes make your writing come across as a little stuffy.

You can also have the AI summarize emails and notifications. I turned everything on just to see it all in action, but it’s a little weird knowing that your phone is stripping the personality out of your friends’ messages and delivering them to you like an administrative robot. You may feel it too. Here’s an example from a group chat: a friend shared photos of his son enjoying a car ride with his dad, and the summary boiled it all down to a child reaching for the hood of the car. Got it? Or when my editor told me the draft of my video was running 17 minutes and needed to be trimmed, the summary left out the important details, so I had to ask for them. These summaries kick in when you have long messages or multiple messages from one source, such as Slack or iMessage.
All the information is condensed and can be expanded with a tap. But turning it on can also give off a dystopian vibe. You may have seen the summary that went viral on X earlier this month, from a New York City software developer who got an AI recap of a breakup text. I tested it myself by firing off a bunch of messages to my colleague Lisa Eadicicco, and it wasn’t too bad, but the summary didn’t capture all of my rambling. So be prepared for someone to say they missed the details of your message, and tread carefully if you’re delivering sensitive relationship news this way.

The second big change is photos. You can search for your photos in more natural ways, being a little more conversational about describing what you’re looking for. One of the neat tricks is that it can create memory movies just from your request. Not everything comes out perfect, but it’s pretty impressive. When I wanted to dig into the past, I asked for videos of my kids crawling and then for videos of them learning to walk, and it could tell the difference. Of course, there are limitations: when I asked for photos of my kids at Legoland, it didn’t know what Legoland was, but it did understand when I asked for images of my kids playing with Lego bricks.

Cleaning up your photos can be fun, too. I took a cool photo of my son dressed as Link, and I wanted to erase a person in the background. You just circle the area you want to erase with your finger, and it generates a generic patch of tree or grass as filler where the person was.

You may also find yourself talking to Siri in new ways. Follow-up questions can go more smoothly, and even if you stumble over your words, Siri still understands you. You can also type to Siri by double-tapping the bottom of the screen to bring up a text prompt; just type your question. But be prepared that Siri won’t be able to do everything you expect it to know how to do. Mine is still running a beta, so hiccups are to be expected, but I got a little bold and asked Siri to show me photos of Bridget. I got Bing search results for Bridgerton.

Yes, all that stuff from Apple’s big pitch is still in progress: the new assistant that’s more personal and can piece together facts from your calendar and email messages to give you customized answers. That’s not here yet. In an interview with The Wall Street Journal, Apple software chief Craig Federighi said the Siri improvements will be phased in over the next year, calling it a big step that Apple wants to get right. Sometimes you put something out there and it messes up, and Apple’s mindset is to get each part right and release it when it’s ready.

Perhaps the biggest question at this point: what about Genmoji? When will you be able to create your own emoji? Well, the first iOS 18.2 developer beta was released this week. So while you may see Genmoji on social media, or your tech-savvy friends may send you some super unique emoji, that’s coming from the developer beta. It’s not for everyone yet, and there are plenty of other features in that developer beta. It also includes ChatGPT integration for questions that draw on knowledge from the broader web.
This developer beta also features Visual Intelligence, Apple’s version of Google Lens. Think of this as an ongoing story, so subscribe so we can keep digging into it, and if you’re testing the beta, let us know your thoughts in the comments section. For now, we’ll have to wait and see what kind of monster Apple has created here once this intelligence is fully roaming the land. And a thank you to my daughter, who has no idea I combed through her arts and crafts supply box this morning to make this monster. I don’t know, I think he’s kind of cute.