For the last few years, Apple’s macOS releases have been interesting, if not particularly exciting. But that’s all set to change this year with the launch of macOS Sequoia, and it’s all thanks to one feature: Apple Intelligence.
Apple’s artificial intelligence (AI) platform has the potential to completely change how you use your Mac on a daily basis. From generating images, rewriting emails, and summarizing your audio recordings to revamping Siri into a much more capable virtual assistant, Apple Intelligence could be the most significant new macOS feature in years.
Now that it’s available in the latest macOS Sequoia beta, I thought I’d take Apple Intelligence for a spin to see whether it’s worth your time. Read on to see my first impressions.
Siri
The good news is that you can try out Apple Intelligence on your Mac for free by downloading the macOS 15.1 developer beta. The bad news is that a large number of Apple Intelligence’s features — the majority, in fact — are currently missing in action and unavailable to test out. So, while I was able to try a few parts of Apple Intelligence, it was far from a complete look at the new system.
Let’s start with Siri. Apple’s virtual assistant has squandered its early lead since it launched in 2011 and has fallen far behind its rivals in recent years. Apple Intelligence is a massive chance to close the gap, injecting the assistant with a much-needed dose of attention (and, you know, artificial intelligence).
Unfortunately, many of the new Siri's features are evidently not ready, and Apple hasn't added them to macOS Sequoia yet. Awareness of a question's context, as well as what's happening on your screen, is totally absent, for example, and the same goes for Siri's ability to take actions inside other apps, among other features.
With that in mind, what about the remaining new Siri features that you can try out? One of the additions that Apple talked up this year was Siri’s ability to understand you if you change your mind or stumble over your words. The idea is you can say something like “Siri, set a timer for 10 minutes … no, five minutes. No, three minutes” and it will know to set a timer for three minutes and disregard your earlier instructions.
Except this doesn't work very well at all, at least not in my testing. Every time I spoke this phrase, the Siri window showed it was logging my words accurately; the displayed text was correct. The problem is that across several attempts, Siri almost never set the timer I asked for. Instead, it set a five-minute timer, an eight-minute timer, and, most bizarrely, a timer for six hours, 32 minutes, and 18 seconds, seemingly picking durations at random. Clearly, more work is required here.
What else is new? Typing to Siri is much easier than before: just select the Siri icon in your Mac's menu bar, and you can type to the assistant right away. Previously, you had to dive into your Mac's accessibility settings to enable this. The change makes Siri much more useful on the Mac, since you might be working in a library or coffee shop where you don't want to disturb other people by talking to it. Strangely, though, I could use the Type to Siri feature even though it was apparently disabled in my accessibility settings. Either way, it's a handy addition to macOS.
Siri also has a new user interface, with glowing edges that pulse as it works on your query. But it still feels very limited since most of its features are still absent from the macOS Sequoia beta.
Writing Tools
When you think of artificial intelligence, writing tools are probably one of the first things that come to mind. You know the type: getting AIs like ChatGPT to rewrite text you feed them or summarize the copy on a web page.
Now, macOS can do that sort of thing natively. And unlike Siri, this feels a lot more fleshed out. Somewhat predictably, Apple calls this feature Writing Tools. Just highlight a few lines of text, right-click it, and you'll see the Writing Tools menu item. From here you can proofread or rewrite the text; make it more friendly, professional, or concise; or summarize it, pull out key points, turn it into a list, or transform it into a table. There's also an option to open a small Writing Tools window next to your text.
These tools are generally pretty good, whether you want to rephrase your text, get a summary, or make it easier to understand. In particular, I can see these tools being useful for writing formal or important emails, or for drafting documents before you make later edits.
I don’t think it’s the sort of thing I’ll be using every single day, but it might be helpful to get some ideas for rewriting my words every now and then. The text manipulation tools (summarizing, turning text into a list or table) feel a bit more useful to me, though, especially when I’m confronted with a large wall of text and just don’t have the energy to read through it all.
The best part of the Writing Tools is that they aren't limited to one app. Instead, they work all over the place, including in some of the best third-party Mac apps. That extends their usefulness considerably: you can use them wherever you feel comfortable rather than switching apps (and potentially interrupting your workflow), which makes it far more likely you'll actually use them on a regular basis.
Transcription
Another useful Apple Intelligence feature that has made it into the latest macOS Sequoia beta is audio transcription. This comes into play with phone calls and sound recordings, and in both cases Apple Intelligence will try to generate a summary of what was said for you.
I tried it out on a 38-minute audio recording in the Voice Memos app. Getting started is simple: You just select the recording you want to transcribe, then select the speech bubble icon in the app's top-right corner. For my 38-minute recording, Apple Intelligence produced a transcript in about 25 seconds.
Unfortunately, the transcript was peppered with wrongly transcribed words. That's not so unusual; even the best AI-powered audio-transcription services make mistakes. What is annoying is that you can't correct any of these errors, as there's no way to edit the text. Apple Intelligence also doesn't separate different speakers in a recording, and since you can't do that yourself either, its usefulness for transcribing calls and interviews is limited. It's OK for quick-and-dirty transcription, but you'll want a professional transcriber or a better AI tool for more important tasks.
That said, using the Writing Tools feature to summarize your transcriptions is more fruitful. My 38-minute recording was summed up in four paragraphs by Writing Tools, and it generally got the gist of what was said and included most of the salient points.
For now, Apple Intelligence’s transcription tools remain a work in progress, but there’s definitely potential there as long as Apple can refine the rough edges.
What’s missing?
That’s more or less all the Apple Intelligence features that are available in macOS Sequoia right now. But there are plenty more waiting in the wings that have yet to be released, with predicted launch dates ranging anywhere from shortly after macOS Sequoia’s launch date to well into 2025.
Image-creation tools like Genmoji (custom emoji you can generate yourself), the Image Playground picture maker, and Image Wand (which transforms rough sketches into polished drawings) are all missing. That's also true for Apple's ChatGPT integration and the priority notifications feature, which lets only the most important alerts reach you immediately, shutting out most other distractions.
I’ve already mentioned that many Siri features aren’t ready yet, from on-screen awareness to controlling other apps. Apple’s Mail app is also due for a redesign, with new sections that automatically triage your mail for easier management, while smart, AI-assisted replies are coming to Mail, Messages, and more. Right now, they’re all nowhere to be seen.
Finally, AI-powered features in the Photos app (like searching for images using natural language and airbrushing out background elements from your pictures) might be present, but I was unable to test them. Every time I tried to open the Photos app, it simply crashed.
In a way, that itself neatly sums things up for Apple Intelligence in macOS Sequoia's latest beta. Right now, there are a few things to try out (and many of them are pretty good so far), but plenty of features are in rough shape or downright absent. There's a lot I'd love to test, but for now, we'll just have to be patient as Apple slowly drip-feeds Apple Intelligence features over the next few months.