Key Takeaways
- Apple Intelligence integrates AI across multiple apps for personal and cross-app functionality.
- Apple refrains from buzzwords like AI, focusing on “personal intelligence” features.
- Unlike Google Gemini, Apple Intelligence emphasizes on-device use and privacy.
What started as a sprinkle of intelligence-based announcements during Apple’s annual WWDC keynote on June 10 turned into a downpour of AI-powered features as the company closed with Apple Intelligence. Unlike the Gemini app on the latest Google devices, Apple Intelligence isn’t a dedicated app arriving with the latest iOS, iPadOS, and macOS updates. Instead, Apple Intelligence works across multiple apps while drawing on each user’s personal context.
During the event, Apple refrained from even using the AI buzzword, instead largely referring to Apple Intelligence as “personal intelligence.” While Apple Intelligence will power some features similar to Google’s Gemini, such as the new organization and search tools coming to Photos, many of Apple’s new AI features remain fundamentally different from Google Gemini’s latest offerings because of the cross-app functionality and the focus on “personal intelligence.”
What does Apple mean by personal intelligence? For starters, Siri will understand context, enabling you to ask follow-up questions or have the voice assistant carry out related tasks for you. The AI will also power features like priority notifications in both Mail and Messages, placing the most important notifications at the top and choosing which notifications to let through when a Focus mode is active.
But for users miffed by those differences, Apple is uncharacteristically launching support for external models, beginning with ChatGPT later this year. The ChatGPT integration, powered by GPT-4o, will, like Apple Intelligence, work across Siri and system-wide writing tools in apps like Notes and Pages. Apple said additional models will follow, though it did not mention Gemini by name.
“We’ve been using artificial intelligence and machine learning for years,” Apple CEO Tim Cook said during the presentation. “Recent developments in generative intelligence and large language models offer powerful capabilities that provide the opportunity to take the experience of using Apple products to new heights. So as we look to build in these incredible new capabilities, we want to ensure that the outcome reflects the principles at the core of our products. It has to be powerful enough to help with the things that matter most to you. It has to be deeply integrated into your product experiences. Most importantly, it has to understand you, your relationships, your communications, and more. And of course, it has to be built with privacy from the ground up.”
Apple Intelligence is Apple’s take on AI
It strives to understand context
Announced on June 10, Apple Intelligence is Apple’s take on AI, built on generative models and large language models. Rather than existing as a standalone app or web platform, Apple Intelligence works across multiple apps on compatible devices from the company. Apple users can find Apple Intelligence inside Pages and Notes for summarizing or rewriting content, inside Messages for creating AI-generated custom emoji, or Genmoji, and inside the voice assistant Siri, to name just a few of the places where it will be integrated. While Apple Intelligence works across multiple existing apps, it will also power new experiences, including Image Playground, Apple’s take on generative AI images.
How Apple Intelligence is similar to Gemini
Multi-app integration, Photos and homework help feel similar
Apple’s announcements came with a sprinkle of AI throughout, followed by an onslaught at the end, whereas Google I/O was heavily focused on AI for the entire keynote. And while Apple Intelligence and Gemini have several fundamental differences, there are a few similarities.
First, Apple Photos will soon feel more like Google Photos. Searching for a specific photo or video, editing photos with one tap, and removing distractions from the background are all AI-powered tools coming to a drastically redesigned Photos app on iOS, iPadOS, macOS, and visionOS.
Some of Apple’s announcements also felt similar to the homework help in Gemini. A Calculator app is finally coming to the iPad, and with the Apple Pencil, it can solve hand-drawn equations and even graph them. Those with an iPad and Apple Pencil will also be able to use Smart Script, which keeps your handwriting yours but cleans up shaky lines and even lets you copy and paste text in your own handwriting.
Integration is also a key feature for both Gemini and Apple Intelligence. Where Gemini is integrated into Google Workspace, Apple Intelligence can similarly summarize or rewrite text in Pages, Mail, and other writing apps, while also working in Photos, Messages, and more.
How Apple Intelligence is different from Gemini
Apple Intelligence is built into existing tools rather than entirely new ones
Apple’s presentation was arguably less exciting than Google’s demonstration of how Gemini will soon be able to interact with the world through real-time video; Apple Intelligence doesn’t have the bells and whistles of Project Astra. But while Apple Intelligence feels less groundbreaking, Apple’s platform feels more tuned to usability.
Apple Intelligence builds on existing tools rather than creating entirely new ones the way Gemini is doing with Project Astra. Without Gemini’s upcoming ability to interact through live video, you can’t solve a math problem by pointing a camera at it. But you can soon use the Apple Pencil to solve handwritten math equations in Calculator or turn your handwritten notes into your own copy-and-paste “font,” both features coming to iPad.
Apple Intelligence is also focused on integrations with apps that Apple fans already use. You don’t need to open a separate app; you can use the AI right inside Pages and Notes, for example. Users can also take action across multiple apps and ask the AI to handle tasks for them in natural language, like “play the podcast that my wife sent me the other day.” Yes, Gemini has similar integrations with the Android platform and Google Workspace, but unlike Gemini, Apple Intelligence isn’t a standalone app.
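Apple hasn’t detailed the developer plumbing behind these cross-app requests, but a reasonable guess is that they resolve to actions apps already expose through Apple’s App Intents framework, which Siri and Shortcuts use today. The sketch below is purely illustrative under that assumption; the intent, its parameter, and the PodcastPlayer helper are hypothetical names, not Apple’s actual podcast APIs.

```swift
import AppIntents
import Foundation

// Hypothetical playback helper standing in for an app's real player code.
enum PodcastPlayer {
    static func play(episodeNamed name: String) async throws {
        print("Playing episode: \(name)")
    }
}

// Hypothetical intent exposing a "play this episode" action to the system,
// so a request like "play the podcast my wife sent me" could resolve to it.
struct PlayPodcastEpisodeIntent: AppIntent {
    static var title: LocalizedStringResource = "Play Podcast Episode"

    // The assistant would fill this in from the user's request and personal context.
    @Parameter(title: "Episode Name")
    var episodeName: String

    func perform() async throws -> some IntentResult {
        try await PodcastPlayer.play(episodeNamed: episodeName)
        return .result()
    }
}
```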
Apple Intelligence is also designed to run on-device from the start, whereas Google’s Gemini comes in different tiers, the smallest of which is designed to run on mobile devices without a network connection. Apple’s on-device intelligence is a nod to privacy, keeping photos and personal data from ever needing to leave the device. While there are some instances where a network connection is needed, Apple says the software will notify the user before sending data off-device to Private Cloud Compute. And in those scenarios, Apple says the data isn’t stored or made accessible to the company.
Apple Intelligence may be more ethical than Gemini
Artists and creators will be less miffed with Apple’s version of AI
While the list of Apple Intelligence features feels less groundbreaking than Gemini’s or even ChatGPT’s latest launches, the way Apple’s AI tools are integrated feels both more useful and more ethical. Apple Intelligence feels more like a personal assistant than a tool that could outright replace some careers. The list of Writing Tools notably doesn’t include an option to generate new text. Instead, Apple’s demonstration focused on using AI to rewrite what you wrote, proofread it, or summarize it. Asking AI to proofread a cover letter for a job feels more ethical than asking AI to write it entirely from scratch.
Apple’s image-generation platform is also geared more toward social interactions than artwork for walls or publications. Creating custom Genmoji, for example, feels less invasive than using AI to generate a new piece of artwork for your walls instead of hiring an artist. Given generative AI’s well-documented trouble rendering things like hands and text, Genmoji and custom images for Messages feel like a more fitting use of the imperfect technology.
Apple Intelligence is limited to newer devices
Apple Intelligence is only for certain iPhones, iPads and Macs
While on-device AI is a win for privacy, the move severely limits who can access Apple Intelligence. Only Apple’s latest chips are powerful enough to handle the demands of on-device AI, which means the brunt of the features will only roll out to the iPhone 15 Pro, the iPhone 15 Pro Max, and Macs and iPads with an M1 or later chip. The Apple Intelligence features will roll out to these devices in beta sometime this fall.