I’m not a Mac user, but after hearing about Apple Intelligence at WWDC 2024, I might become one. This AI-powered suite is along the lines of Microsoft Copilot+, touching every aspect of the Mac, iPhone, and iPad to provide AI assistance. The AI market is already saturated with options, but with Apple Intelligence, I have to admit — I’m already hooked.
Instead of building the Mac around AI, Apple is building AI around the Mac. It’s a systemwide utility that makes the Mac much more useful overall. Here are the Apple Intelligence features coming to the Mac, and why they have me so excited.
Personal context
Apple dedicated a fair amount of screen time during its keynote to understanding language and personal context. Without those two things, large language models and AI assistants are little more than a sum of what we tell them. For instance, starting a new conversation with Bing Chat or ChatGPT is a total blank slate, and the AI can only provide you with responses based on the data it was trained on — or by searching the web. That’s not going to be the case for Mac users going forward, though.
Adding personal context and access across many apps is what truly makes the AI “pop” and become something akin to a virtual assistant. To that end, Apple is bringing Apple Intelligence not just to the Mac, but also to iPadOS and iOS. From prioritizing the right notifications to enabling more natural conversations with Siri, Apple Intelligence is meant to make AI more approachable for users.
Apple Intelligence enables various apps on your Mac to work together in order to provide the right context when you need it. During its demo, Apple showed how the App Intents framework can help Siri access and move information throughout your apps. It’s not just limited to Apple apps, either — developers can use the API to add support to their own apps. As a result, you can ask Siri when someone’s flight is scheduled to land, and it will look through your emails or messages to provide a thorough response. Follow that up with directions to the airport, and you won’t have to elaborate as to which airport — Siri already knows.
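For developers, exposing an app action to Siri this way means declaring an intent with Apple’s App Intents framework. As a rough sketch of what that looks like (the `FlightTracker` helper and its API are hypothetical, invented here for illustration; only the `AppIntent` protocol shapes come from Apple’s framework):

```swift
import AppIntents

// A hypothetical intent that lets Siri ask the app for a flight's arrival time.
struct GetFlightArrivalIntent: AppIntent {
    static var title: LocalizedStringResource = "Get Flight Arrival"

    // Siri can fill this parameter from conversation or personal context.
    @Parameter(title: "Flight Number")
    var flightNumber: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Look up the flight in the app's own data (hypothetical helper).
        let arrival = try await FlightTracker.shared.arrivalTime(for: flightNumber)
        return .result(dialog: "Flight \(flightNumber) is scheduled to land at \(arrival).")
    }
}
```

App Intents itself predates Apple Intelligence; what Apple showed at WWDC is Siri getting much better at chaining these intents together across apps using personal context.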
Much like Microsoft’s Copilot, Siri can help you find your way around macOS, find a good pub to go to on the weekend, or search through your emails to pull up the highlights. It can also work across multiple apps at the same time; for example, if you ask the AI to find and play that podcast that someone sent you a few days ago, it will dig through your messages and launch Apple Music (or any other app) to play it. All of this comes down to relying on context.
Some of this isn’t new. The difference, however, is that you won’t necessarily have to buy a new device to try out the power of personal context in AI, as these updates will be available for free to everyone with a Mac running an M1 chip or later.
Image tools
While not exclusive to the Mac family of devices, Apple’s image tools excite me for reasons that are silly, but also very human. As a person with zero art skills, the ability to create cute emojis or intricate images in-app with just a short text prompt sounds pretty awesome, and that’s what Genmoji can do.
Genmoji works in several ways. You can describe an emoji to the AI — in the demo, Apple talked about a T-rex in a tutu on a skateboard — and get a few generated images to choose from as your new emoji. The other way to use it relies on context. Thanks to being able to access your photos and contacts, Genmoji can help you create personalized images of your friends and family and send them over at any given time. Images can be created in three styles: sketch, illustration, and animation.
Moreover, Apple Intelligence can turn a rough sketch into a polished, AI-generated image.
Writing tools
Built into macOS 15, Apple Intelligence can help you write emails, edit your blog posts, or summarize long documents, all within the app you’re using. The AI can rewrite, proofread, and summarize text for you, as well as create a table of contents. It can also adjust the tone of your writing to be friendly, professional, or concise.
I know, these features don’t sound groundbreaking. It’s nothing that ChatGPT, Copilot, Gemini, or even Grammarly can’t already do. Again, the benefit here stems from the fact that it’s systemwide.
Let’s say you’ve drafted an email to a pesky co-worker, but upon rereading it, it sounds way too blunt to send. Instead of taking that deep breath and starting from scratch, you can simply use Apple’s built-in AI to turn the email into something friendlier. Without it, you’d have to paste the draft into your tool of choice, do a little back and forth with it, and then paste it back to finally send it.
Bigger than it seems
With tools such as ChatGPT readily available, and pretty much every big tech company going all-out on AI, some of the announcements I talked about above are starting to feel like old news. However, with the way Apple’s integrating AI into its ecosystem, these things that we’ve already seen before are finally starting to feel natural. Although I’m a Windows user, I’ve never been more tempted to give Apple a proper try, because the way these features all weave together feels much more seamless than what I’ve seen from Apple’s rivals so far.
If Apple had announced all of this two years ago, we’d have been floored. Now, the impact of things such as generative images or proofreading is relatively small, and even the way Siri can now act as your go-to across every Mac, iPhone, and iPad may not sound like a big deal — but it is. With Apple Intelligence, we’re seeing the first iteration of systemwide AI that acts across multiple devices and apps without being a headache to navigate. If you ask me, that is good news.
When Apple said “AI for the rest of us,” it meant business. I can’t wait to see all these new AI updates start rolling out later this year, with an official launch planned for sometime in the fall.