Apple’s WWDC 2024 keynote has come and gone. It was quite a memorable one, starting with an action-packed opening sequence before diving deep into the new features coming to all of Apple’s latest software updates.
- What is Apple Intelligence?
- Prioritize notifications
- Writing Tools to make you a better writer
- Create your own images with Image Playground
- The next generation of emoji: Genmoji
- A new Photos app
- Siri gets a long overdue makeover
- Apple Intelligence and privacy
- OpenAI ChatGPT integration
- Can my iPhone run Apple Intelligence, and when can I use it?
One of the biggest focuses this year was Apple Intelligence, the umbrella name for the AI-powered tools behind many of those new features. If you missed all of the cool new AI things coming, here’s a rundown of it all.
What is Apple Intelligence?
In short, Apple Intelligence is “AI for the rest of us.” While AI has been trending in the mobile industry recently, Apple could be the one to make it truly mainstream rather than just something for the techies.
Apple Intelligence is a new AI system that is “comprised of highly capable large language and diffusion models specialized for your everyday tasks,” according to Craig Federighi, Apple’s senior vice president of software engineering.
“This is a moment we’ve been working towards for a long time,” Federighi said during the WWDC keynote. “Apple Intelligence is the personal intelligence system that puts powerful generative models right at the core of your iPhone, iPad, and Mac. It draws on your personal context to give you intelligence.” The machine learning system promises to enable your mobile and laptop devices to “understand and create language, as well as images, and take action for you to simplify interactions across your apps.”
Apple covered a lot of these features in quick succession during the keynote, so it would be easy to miss a thing or two. In short, Apple Intelligence will help make your life a little easier — even with Siri.
Prioritize notifications
I’m sure that you get bombarded with notifications throughout the day — I know I do. Sometimes, it can really get out of hand, and I just have so many that I don’t know what’s actually important.
Apple Intelligence will prioritize your notifications so that the most important and crucial ones are at the top of the stack. For example, during the keynote, it was shown that time-sensitive notifications, such as dinner invites, work meetings, and an Instacart delivery, were at the top of an iPhone’s notification stack. This would also be useful for particularly active group chats.
There is also a new Focus mode called Reduce Interruptions. When this is on, only notifications that may need immediate attention will show up, such as when you need to pick your child up from daycare.
Writing Tools to make you a better writer
iOS 18, iPadOS 18, and macOS Sequoia will have new Writing Tools built in systemwide. With these new tools, users are able to rewrite, proofread, and even summarize text across a multitude of apps — including Mail, Notes, Pages, and even third-party apps.
Rewrite uses Apple Intelligence to turn what you’ve already written into different versions. This can help you adjust the tone to suit the audience and task at hand, such as making a cover letter more professional or adding some humor to a party invitation.
Proofread will check grammar, word choice, and sentence structure while also providing suggestions for edits (and even explanations of why that edit should be made). Accepting these edits is quick and easy, though it’s also optional — just a helpful tool.
Summarize allows users to select text and get a recap of the key information in a digestible paragraph, bullet points, table, or basic list. Summaries can also be found at the top of emails in the Mail app, along with priority emails at the top of the list. Smart Reply in Mail can identify sections of an email to help you craft a custom message in response to it.
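Third-party apps get these tools largely for free, by the way, since Writing Tools hooks into the system’s standard text controls. For developers who want to tune how much of the experience a given view gets, here’s a minimal UIKit sketch, assuming the writingToolsBehavior API Apple previewed in its WWDC sessions:

```swift
import UIKit

let textView = UITextView()

// Standard text views pick up Writing Tools automatically in iOS 18.
// Apps can dial the integration up or down per text view:
textView.writingToolsBehavior = .complete // full inline rewriting experience
// textView.writingToolsBehavior = .limited // results appear in an overlay panel only
// textView.writingToolsBehavior = .none    // opt this view out entirely
```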
In the Notes app, users can also record and transcribe audio. Apple Intelligence will then generate a text summary of the key points for you and display them at the top. This also works in the Phone app, allowing users to record calls for the very first time (people on the other end are automatically notified).
Create your own images with Image Playground
Image generation is a big part of AI in general, though its ethics are always going to be a hot topic. Apple now has its own take with Apple Intelligence, and whether you love it or hate it, it looks like it’s here to stay.
The new Image Playground lets users create original images to communicate and express themselves in fresh ways. There are three styles available in Image Playground: Animation, Illustration, and Sketch. Image Playground is built right into first-party apps like Messages, but it also has a standalone app so you can experiment. The Image Playground feature is also contextually aware. In Messages, for example, it can offer suggested image creations that are related to the messages in a thread.
Apps like Notes can also access Image Playground through the Image Wand tool in the Apple Pencil tool palette. This tool can turn your sketches into full-on images or generate a new image with context from the surrounding area.
The Image Playground feature is also available in other apps like Keynote, Freeform, and Pages. Third-party developers can also choose to implement it with the Image Playground API.
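Apple hadn’t published full documentation for that API at keynote time, but based on the ImagePlayground framework it announced, presenting the system’s generation sheet from a third-party app could look roughly like this sketch (the ImagePlaygroundViewController surface shown here is an early one and could change before release):

```swift
import UIKit
import ImagePlayground

class ComposeViewController: UIViewController, ImagePlaygroundViewController.Delegate {
    // Present the system Image Playground sheet, seeded with a text concept.
    func showImagePlayground() {
        let playground = ImagePlaygroundViewController()
        playground.concepts = [.text("a birthday cake shaped like a camera")]
        playground.delegate = self
        present(playground, animated: true)
    }

    // Called when the user finishes; the generated image arrives as a file URL.
    func imagePlaygroundViewController(_ controller: ImagePlaygroundViewController,
                                       didCreateImageAt imageURL: URL) {
        if let data = try? Data(contentsOf: imageURL), let image = UIImage(data: data) {
            // Insert `image` into the message, note, or document being composed.
            _ = image
        }
        controller.dismiss(animated: true)
    }

    // Called if the user backs out without creating anything.
    func imagePlaygroundViewControllerDidCancel(_ controller: ImagePlaygroundViewController) {
        controller.dismiss(animated: true)
    }
}
```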
The next generation of emoji: Genmoji
Unicode Consortium? Who needs ‘em? Now we have the next generation of emoji with Genmoji!
Apple Intelligence takes emojis to an entirely new level with Genmoji. In short, you can create your own original Genmoji to express yourself. All you have to do is type in a description, and Apple Intelligence gives you your very own personalized emoji, along with several other options if the first one isn’t to your satisfaction.
Genmoji can even be created based on photos of friends and family. And like regular emoji, a Genmoji can be used inline in Messages, as a sticker, or even as a Tapback reaction.
A new Photos app
The Photos app is getting a major redesign in iOS 18. You’ll find a simplified single view with a grid for your gallery, where you can browse by year, by month, or all at once. Underneath are Collections and Memories, and you can pin your favorite Collections and customize the view so that what matters most to you shows up first.
The Photos app is also getting imbued with some Apple Intelligence, much like everything else. Search will be much smarter, letting you use natural phrases like “Maya skateboarding in a tie-dye shirt” or “Katie with stickers on her face.”
The Apple Intelligence-powered search also extends to video. You can find specific moments in video clips, and the app will take you right to the relevant segment.
Another new tool in Photos that utilizes Apple Intelligence is the Clean Up tool for photo editing. This works like Google’s Magic Eraser by identifying and removing distracting objects in the background of your photos without altering the subject.
You can also create your own story with the Memories feature by typing a description of what you want to show off. Apple Intelligence picks out the best photos and videos based on the description you used, and then you can create your own video story. It can even add music from Apple Music that matches the memory.
Siri gets a long overdue makeover
Siri, which has become the butt of jokes in recent years, has finally gotten the AI superpowers it deserves. It even has a new look with a simple glowing light that wraps around the edge of the screen when it is active.
Now powered by Apple Intelligence, Siri is more deeply integrated into the overall system experience. It has a better understanding of natural language and is more conversational than before; thanks to more advanced natural language processing (NLP), it can even follow along if you stumble over your words. Siri maintains context from one request to the next, and it’s easier to switch between voice and typed input, whichever works better at that particular moment.
Siri has been given extensive Apple product knowledge, so you can even ask it how to do something on the iPhone, iPad, and Mac. With Apple Intelligence, Siri now has onscreen awareness, so it will understand and take action with user content in more apps over time. Siri can even perform hundreds of new actions across first- and third-party apps.
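Under the hood, those in-app actions flow through Apple’s App Intents framework: an app describes the actions it supports, and Siri, with Apple Intelligence behind it, can match natural requests to them. Here’s a minimal sketch of what a journaling app might expose (AddJournalEntryIntent and JournalStore are hypothetical names, not Apple’s):

```swift
import AppIntents

// Hypothetical intent a journaling app could expose so Siri can handle
// requests like “Add ‘landed in Tokyo’ to my journal.”
struct AddJournalEntryIntent: AppIntent {
    static var title: LocalizedStringResource = "Add Journal Entry"
    static var description = IntentDescription("Adds a new entry to the journal.")

    @Parameter(title: "Entry Text")
    var text: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // The app’s own persistence would go here, e.g.:
        // JournalStore.shared.add(text)
        return .result(dialog: "Added to your journal.")
    }
}
```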
Apple Intelligence and privacy
One of the biggest concerns about using AI in products is privacy. Apple values privacy more than some of its competitors, and that also extends to Apple Intelligence.
Many Apple Intelligence tasks are handled entirely on-device. For more complex requests that need extra processing power, Apple uses Private Cloud Compute: a larger, server-based model that runs on Apple silicon, a foundation that Apple says ensures your data is never retained or exposed.
OpenAI ChatGPT integration
Apple has partnered with OpenAI to bring ChatGPT (powered by GPT-4o) to iOS 18, iPadOS 18, and macOS Sequoia. Users will be clearly asked for permission before any requests, documents, or photos are sent to ChatGPT, and Siri will present the answer directly, so there’s no need to switch tools.
ChatGPT will only be consulted if Siri does not have the answer for the user. It will also be available in the Writing Tools mentioned earlier.
Those who have privacy concerns about using ChatGPT should not worry, as Apple has built-in privacy protections for its ChatGPT access. IP addresses are obscured, and OpenAI won’t store requests.
While the ChatGPT features will be free for everyone to use without an account, those who subscribe to ChatGPT Plus can connect their account and access paid features directly. ChatGPT’s data-use policies only apply to users who connect their accounts.
Can my iPhone run Apple Intelligence, and when can I use it?
Apple Intelligence comes with some steep hardware requirements. For iPhone users, you will need at least an iPhone 15 Pro or iPhone 15 Pro Max, as it takes an A17 Pro chip to run.
For those wondering why the iPhone 14 Pro and 14 Pro Max can’t run Apple Intelligence, it likely comes down to RAM. The iPhone 15 Pro and iPhone 15 Pro Max are the only iPhones with 8GB of RAM, whereas every iPhone before them shipped with 6GB or less. Large language models (LLMs) typically need a lot of memory to run, so it makes sense that only the iPhones with the most RAM can handle it right now. The Neural Engine in the A17 Pro chip is also up to twice as fast as the one in the A16 Bionic, giving it considerably more NPU power.
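For a rough sense of the math: Apple has said its on-device language model is on the order of 3 billion parameters. Even quantized down to roughly 4 bits per weight, that works out to about 3 billion × 0.5 bytes, or around 1.5GB of RAM for the weights alone, before you account for the model’s working memory, the operating system, and every app you have open. On a 6GB phone, that gets cramped fast.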
Though the iOS 18 developer beta is out right now, with a public beta coming later, it does not currently include Apple Intelligence. Apple will begin rolling out those features in beta later this fall.