
Examining Apple Intelligence

Whoever at Apple came up with the term “Apple Intelligence” must be pleased with themselves. It’s a clever way of creating a distinction between what Apple is doing with AI and the generative AI chatbots and artbots most people think of as AI.

(I wouldn’t be surprised if there were internal discussions about how “AI” might come to mean “Apple Intelligence,” but that’s no more going to happen than when Apple thought it could capture the word “email” with the “eMail” feature of the eWorld online service. True story: when my late friend Cary Lu was writing a book about eWorld in the mid-1990s, Apple told him that was the hope.)

It’s worth emphasizing that Apple Intelligence will be rolling out slowly “over the course of the next year.” Don’t expect to see everything discussed below in September with the initial releases of iOS 18 and macOS 15 Sequoia, and we could be well into 2025 before some of the more compelling enhancements to Siri arrive. Patience, grasshopper.

Apple Intelligence Versus AI

What differentiates Apple Intelligence from AI? Apple calls Apple Intelligence a “personal intelligence system,” and indeed, most of the features the company unveiled during its WWDC keynote revolve around your personal data. Apple’s insight—which fits neatly with the company’s focus on privacy and empowering the individual—is that many AI systems suffer because all they know about you is what you tell them in your prompts. Generative AI chatbots and artbots are trained on a massive corpus of material, and they return results that are statistically probable. But as individuals, we are anything but statistically probable. We are our data: our email, our messages, our photos, our schedules, our contacts.

The problem with Apple’s homegrown tools—notably Siri—focusing on personal context is that people will ask questions that can’t be answered with local information. When Siri determines it can’t respond well to a query requiring global knowledge, it will offer to pass the question off to ChatGPT, which is also available within Apple’s systemwide writing tools. ChatGPT integration is free, but ChatGPT Plus subscribers can connect their accounts, though I’m uncertain what benefit that provides at the moment. Apple said it plans to support other AI chatbots in the future, such as Google’s Gemini. Among the chatbots I’ve tested, I’ve had the least success with Gemini and the most with ChatGPT and Claude.

Apple Intelligence Privacy

It’s reasonable to worry about how these features will impact your privacy. Apple repeatedly emphasized the pains it has taken with Apple Intelligence to ensure user privacy. The AI-driven features can take one of three paths:

  • On-device: Much of what Apple Intelligence does will be handled entirely locally, never leaving your device in any way. That’s why the system requirements for Apple Intelligence are so steep—an iPhone 15 Pro or iPhone 15 Pro Max with an A17 Pro chip, or iPads and Macs with an M-series chip. Apple Intelligence’s processing and memory requirements are such that lesser chips aren’t sufficient.
  • Private Cloud Compute: Some Apple Intelligence tasks—the company hasn’t said which—exceed the capabilities of even Apple’s latest chips. For such tasks, Apple has built a server system called Private Cloud Compute that relies on custom Apple silicon and a hardened operating system designed for privacy. Private Cloud Compute receives only the data necessary to complete the task and discards everything after completion. Could Private Cloud Compute eventually be used for Siri requests made from HomePods, Apple TVs, and older devices, or will Apple use the system requirements for Apple Intelligence to encourage upgrades?
  • ChatGPT: Apple can’t make the same privacy promises with ChatGPT as it can for on-device and Private Cloud Compute processing, but it said that our devices’ IP addresses will be obscured and OpenAI won’t store requests. OpenAI does use content from individual accounts (not business offerings) to train its models, although you can opt out of that. It’s unclear if or how you can opt out of OpenAI training on content submitted through the Apple Intelligence integration.
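To make the three paths concrete, here is a minimal sketch in Swift of the routing decision Apple describes. Every name in it is invented for illustration; Apple hasn’t published an API for this, nor said exactly how requests are classified.

    // Purely illustrative: none of these types are real Apple APIs. The enum and
    // routing function simply restate the three paths described above.
    enum ProcessingPath {
        case onDevice             // handled entirely on the iPhone, iPad, or Mac
        case privateCloudCompute  // Apple silicon servers; data discarded after use
        case chatGPT              // explicit hand-off, only after the user agrees
    }

    struct Request {
        let needsWorldKnowledge: Bool    // e.g., "Who directed Casablanca?"
        let exceedsDeviceCapacity: Bool  // task too demanding for the local chip
    }

    func route(_ request: Request, userApprovedChatGPT: Bool) -> ProcessingPath? {
        if request.needsWorldKnowledge {
            // Apple says Siri asks permission before sending anything to ChatGPT.
            return userApprovedChatGPT ? .chatGPT : nil
        }
        return request.exceedsDeviceCapacity ? .privateCloudCompute : .onDevice
    }

    // A personal-context request that fits on the device never leaves it.
    let path = route(Request(needsWorldKnowledge: false, exceedsDeviceCapacity: false),
                     userApprovedChatGPT: false)
    print(path == .onDevice)  // prints: true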

Apple Intelligence Features

Apple Intelligence is an umbrella term for three classes of features surrounding language, images, and actions. Language-related features include system-wide writing tools, categorization and prioritization of email messages and notifications, and transcription of recordings, phone calls, and voice memos, along with Siri’s improved understanding of natural language. Image-focused features include Genmoji, the image-generation tool Image Playground, and advanced editing capabilities in Photos. What Apple calls “actions” mostly seem to involve enhancements to Siri that enable it to perform specific tasks, even across multiple apps.

Language Tools

The most prominent of the Apple Intelligence language capabilities may be Apple’s new systemwide Writing Tools, which will be available in both Apple and third-party apps. They’ll help you proofread, rewrite, and summarize your text along the lines of what Grammarly does today.

I’ve relied on Grammarly for years for proofreading. It catches typos, doubled words, and extra spaces, and its newer AI-powered features sometimes make helpful suggestions for recasting awkward sentences. I’m slightly annoyed that Grammarly’s proofreading tools are so helpful, but it’s challenging to edit your own text to a professional level, and Grammarly can identify errors much faster than I can. Don’t assume that proofreading tools like Apple Intelligence’s, which help with grammar, word choice, and sentence structure, are necessarily a crutch. They may be for some people, but even people who care about their writing can still benefit from some suggestions while ignoring unhelpful ones. (For instance, Grammarly is allergic to the words “own,” “actually,” and “both,” but when I use them, I do so intentionally.)

It’s easier to question Apple Intelligence’s rewriting and composition capabilities (the latter of which rely on ChatGPT), but you’ll notice that most of those doing so are professional writers who don’t need them. Recall my point from “How to Identify Good Uses for Generative AI Chatbots and Artbots” (27 May 2024) that AI is useful primarily when your skills and knowledge wouldn’t already make you better than a C+ student. I do like how Apple provides three tones: friendly, professional, and concise. Less experienced writers often have trouble maintaining a consistent tone, and untold misunderstandings and hurt feelings could be avoided if people took tone advice.

Nonetheless, I’m somewhat dubious about Mail’s Smart Reply feature. Although its Q&A design meets another of my criteria for good uses of generative AI (that you must be willing to work with an assistant), it’s not clear that it would save enough time to be worth using, especially if you had to edit what it wrote to sound like something you would have sent.

Apple Intelligence’s summarization tools are spread throughout the system, and some feel like Apple is throwing spaghetti at the wall to see what sticks. Summaries seem to be associated with at least these features:

  • Text you’re writing: This seems most useful when you need a summary for a blog or social media post.
  • Notifications: If you get so many notifications that you need a summary, you may be better served by taming notifications from overly chatty apps.
  • Web pages saved in Safari Reader: Given that Reader is mainly used for pages that are too long to read immediately, summaries (along with tables of contents) could be helpful.
  • Long messages in Mail: Most email messages aren’t long enough to justify a summary, but summarization could be a boon for quickly parsing long email conversations.
  • Busy group threads in Messages: It’s hard to imagine a sufficiently involved text group thread that wouldn’t be easier (and safer) to read in its entirety, but perhaps I’m not the target audience.
  • Message list summaries in Mail: Replacing the first few lines of message text shown in the message list with a summary seems like an unalloyed win.
  • Transcripts of recordings from Notes and Phone: Given the loose nature of recorded text, transcript summaries may be particularly useful for quickly understanding a talk or call.

The final language tools evaluate the content of notifications and email messages to prioritize which to show you first. I can’t quite imagine how that will work for notifications, but prioritizing email messages should prove popular. Also, a Reduce Interruptions Focus will show you only notifications that need immediate attention. That may seem like a nice middle ground between allowing everything and turning on Do Not Disturb, but it will make the Focus feature even more unpredictable (see “Apple’s New Focus Feature May Be Overkill,” 20 January 2022, and “Notifications Unexpectedly Silenced? Blame Focus,” 17 February 2023).

Image Tools

Apple Intelligence’s image tools run the gamut. The Image Playground app (the features of which will also be available in some apps) will let you create original images from text prompts, much like other AI artbots. Apple said we’ll be able to choose from three styles: Sketch, Illustration, and Paint. That ensures that no one will be using Image Playground to make photorealistic deepfakes. I’m also confident that Apple will put significant boundaries on what Image Playground can produce—I can’t imagine it generating NSFW images, images of celebrities, or anything with trademarks, for starters.

Genmoji, which are AI-generated custom emoji-style graphics, may be more interesting for those who find emoji amusing but have trouble going beyond a few smileys. Often, when I think about using an emoji as an emotive emphasis to something I’ve written in Messages or Slack, the image I desire doesn’t exist. How else will I get a sunglasses-wearing penguin on a surfboard to express my enthusiasm for a suggested outing? Some worry that Genmoji will lack the shared meaning of the limited set of emoji we have now, but most of those shared meanings exist only among subsets of the population as it is, so it’s hard to get upset about this.

The Image Wand feature of Notes, which turns rough finger or Apple Pencil sketches into more polished drawings, has taken some flak online partly because Apple’s demo shows a perfectly passable sketch being “improved.” The criticism here would seem to fall under the same category as professional writers complaining about writing tools—it’s easy to carp if you have illustration skills. As someone who couldn’t draw his way out of a paper bag (or even draw the bag itself), I’m intrigued to see if Image Wand can make sense of anything I sketch. Nonetheless, I don’t see myself using it purely because I rarely sketch anything. I’d far rather write a thousand words.

The three remaining image-related features of Apple Intelligence are in Photos:

  • Descriptive searches: Photos has allowed us to search for objects—cat, beach, airplane—for some years, thanks to capabilities Apple previously described as “machine learning.” With Apple Intelligence, we’ll be able to search for photos using natural language: “Tonya running in a race” or “sunsets over our pond.” Once we become accustomed to the feature, I believe many of us will use it heavily.
  • Clean up background objects: Generative AI will also give Photos the capability to remove background objects from photos. (The generative part involves filling in the background seamlessly.) Those who spend a lot of time on their photos but don’t already rely on a more capable editor like Photoshop will undoubtedly appreciate the option.
  • Custom memory movies: We’ve already hit the “infinite photo” inflection point where it’s difficult to make sense of our burgeoning photo libraries. When you have tens or hundreds of thousands of images, extracting a set that’s representative of something is daunting. Generating custom memory movies with a text prompt could be compelling. I’d like to see Apple open this feature so the movies could be created and viewed on the Apple TV.

Actions, or Siri Gets a Brain

For many people, giving Siri an AI-driven brain may be the main appeal of Apple Intelligence. Although Siri was initially impressive for its time, and Apple regularly expanded Siri’s capabilities and knowledge, it seems to have been degrading over the past few years, a la Flowers for Algernon.

Most importantly, the new AI-driven Siri will have a richer language understanding and be able to maintain context within a conversation so each command won’t have to stand on its own. It should also be much more forgiving of the verbal stumbles we all experience at times.

Apple is making a big deal of Siri being aware of your personal context and on-screen content. That should enable it to find your content across Mail or Messages, extract information directly from Notes, and take action using content you can see on the screen. Its capabilities will span multiple apps, enabling you to ask Siri to enhance a photo and then attach it to a Mail message. I’m unsure how successful these features will be. Siri can do a lot right now, but because you have to know precisely what it can do and phrase the command exactly right, almost no one takes full advantage of Siri’s capabilities. It won’t take many failures—“I’m sorry, Dave. I’m afraid I can’t do that.”—before people give up on Siri again.

Because Siri will work locally on your devices, its knowledge base must be limited. In a clever move, Apple will be giving Siri knowledge about its products, so you can ask for help using your iPhone, iPad, and Mac. I’m looking forward to trying this because it can be tricky, even for people like me, to remember what any given setting does and where Apple has hidden it. (Real-world example: Why do some iPhone users see emerald rather than lime green for green-bubble friends in Messages? Because they’ve turned on Settings > Accessibility > Display & Text Size > Increase Contrast.)

As I noted before, when a Siri query needs access to global knowledge, it will offer to send the question to ChatGPT. While that may work well for many queries, we’ll see if Apple implements it so we can maintain fluid conversations. The main problem is that ChatGPT’s knowledge is time-limited. A more satisfying approach might work along the lines of Perplexity, which performs a search and builds a response based on the summary of what it found. I could even imagine Apple moving in that direction more generally as a way of weaning itself from search-engine reliance on Google, though that would also mean giving up the billions in revenue it gets from Google.

How Smart Will Apple Intelligence Be?

There’s no question that Apple was pushed into creating Apple Intelligence. Many of its features would have worked their way into the company’s apps and operating systems over time, but the hype—some deserved, some not—surrounding AI from other tech giants forced Apple’s hand. Remember, ChatGPT came out only in late 2022, and no one could have predicted how quickly AI would take the online world by storm. Apple hasn’t had much time.

That may account for why Apple Intelligence feels like a grab bag, especially in bolted-on bits like the ChatGPT integration. Some features, such as Image Playground and Smart Reply, feel as though Apple is checking boxes to compete with existing tools. Others will be compelling, such as descriptive searches in Photos. All many people need from AI is for Siri to become less hard of understanding.

Although Apple may be behind the curve in making these features available, the company seems to have approached the architectural questions seriously. On-device processing is important for both performance and privacy reasons, and Private Cloud Compute could set a new standard for what people demand from server-based AI tools.

As far as I can tell, Apple Intelligence won’t be treading on anyone’s lawn. If you don’t want to use it, just ignore it, like all the other features that aren’t relevant to how you prefer to use technology. But I have talked with people who find Apple Intelligence some of the more exciting work Apple has done on the software side in years. Apple’s hardware has hit astonishing levels of performance, but the software hasn’t given most people new capabilities that are possible only because of that processing power.

We live in interesting times, and they may become more interesting in the next six to twelve months.


Comments About Examining Apple Intelligence

Notable Replies

  1. Great description of what’s coming in Apple Intelligence. I’m especially interested in using Siri to control my Apple devices reliably. I’d love to say “Siri read me today’s article in Bloomberg about Nvidia”.

    It sounds like most apps will have to be updated to support the latest App Intents API. I’m sure Apple’s apps will be updated right away; we’ll see how fast third-party apps follow.

    This could affect the app economy. Many apps depend on ad revenue. If no one sees the app on-screen because they’re using Siri voice control instead, in-app ads could die, and even more apps could switch to subscriptions.
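    For reference, a minimal App Intent in Swift looks roughly like this. The intent and its parameter are made up for illustration; only the AppIntent protocol, @Parameter, and perform() come from Apple’s framework.

        import AppIntents

        // Hypothetical intent; a real app would look up the article and return it.
        struct ReadArticleIntent: AppIntent {
            static var title: LocalizedStringResource = "Read Today's Article"

            @Parameter(title: "Publication")
            var publication: String

            func perform() async throws -> some IntentResult & ProvidesDialog {
                // Siri can speak or display the dialog returned here.
                .result(dialog: "Here's today's top article from \(publication).")
            }
        }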

  2. I don’t care about ChatGPT (and if I did, I would just go to chatgpt.com right now without first needing to spend >$1k for a new 15 Pro) or crowdsourcing data or any of that. All I want is for Siri to understand me better (as in, get what I actually want) and for it to gain the right hooks into the built-in apps I routinely use so it can actually help them help me.

    So one day I want this to work:

    Hey Siri, stop navigating to Gott’s and instead get me directions to Bongo Burger on Euclid, but make sure you stop me at an ATM on the way there. … No wait, I have enough cash, forget the ATM. After Bongo get me to West Coast Sporting Goods without taking the freeway.

    Every 12-year-old would understand what I’m trying to do, and Maps can do such navigation just fine right now (albeit requiring a whole bunch of taps and typing, which I’m not going to do while driving), but present-day Siri is hopelessly lost with that. All I’d need is for Siri to get what I’m saying and then be able to set up Maps to get that done. If come September I get iOS 18 and it properly deals with the above, I will be ecstatic.

    OTOH, if that still doesn’t work, but instead I’ll be able to write some silly nonsense and get ChatGPT, er, Apple Intelligence to make it sound smart (while still being wrong), again at the expense of at least a new iPhone 15 Pro, I’ll be excruciatingly meh. Your move, Apple.

  3. An interesting article. And I’m looking forward to trying Apple Intelligence features. I was somewhat disappointed to read elsewhere that only the very newest iPhones will support all the features and that even my iPhone 13 Pro is too old for some of them. However my iPad M1 from 2021 is apparently ok.

    It is funny how Apple is trying to hijack the abbreviation AI. 🙂

  4. So what is better with Apple Intelligence compared to using ChatGPT directly? I also have enough options for image creation without the silly restrictions of Image Playground.

    Siri needs to be improved. It fails badly with mixed languages, which I, as a developer, use all the time. The Apple devs can’t even fix simple problems. In Mail, I still get reminders I did not set every couple of weeks or so.

  5. @doug2 Ben Thompson speculates that Apple’s LLM requires 8 GB to run. If you look at the phones that support Apple Intelligence, the cutoff is 8 GB of memory.
    John Gruber asked Apple VP of AI John Giannandrea about this on The Talk Show and he didn’t deny it.
    Every Mac with an Apple Silicon processor has 8 GB or more.

  6. Thanks for the article. The things that interest me particularly are the transcript features (lots of oral history interviews that I don’t have to listen to word for word), the summarize features (triaging scholarly articles that don’t have abstracts), the automating searching and sorting features in Photos & Mail, and a better Siri.

    If Apple can take advantage of AI and integrate it thoroughly but in a disciplined way into their systems, I think this will be very helpful indeed.

  7. The focus on your personal context is the key difference: the AI chatbots are very good at global knowledge and very poor at knowing anything about you that would inform or improve the results.

  8. I’d love to have location-relevant Siri requests: in Hawaii, where I live, Siri is hopeless with any address, directions, or anything not spelled the “Siri way.” I’m sure the same is true for many First Nations place names as well. I’m also aware that everything everyone everywhere has said to Siri has been recorded, giving Apple a data advantage much like what Tesla has with so many cars driving on so many roads, uploaded every night. That could make a “local Siri” much more intelligent, since it knows how you speak, to whom, and even in different languages. I’m hopeful, yet I still have many texts that read “Siri sucks.”

  9. Ken

    While ChatGPT can write code, I’m hoping Siri/AI will be able to create Shortcuts. I find it maddening trying to figure out how to create what IMHO should be a simple Shortcut.

  10. In that interview Giannandrea said:

    “So these models, when you run them at run times, it’s called inference, and the inference of large language models is incredibly computationally expensive. And so it’s a combination of bandwidth in the device, it’s the size of the Apple Neural Engine, it’s the oomph in the device to actually do these models fast enough to be useful. You could, in theory, run these models on a very old device, but it would be so slow that it would not be useful.”

    which makes it sound a lot more like performance than RAM. That said, I also think RAM is Occam’s razor here. I wonder why Apple is beating around the bush on this one. Perhaps, after years of snark about skimping on RAM, they didn’t want to admit that, snark or not, those criticizing the skimping were not wrong and were essentially more forward-looking than Apple itself had been. Federighi, at least, did appear more willing to “hint” at RAM, according to this MR piece.

    The interview was really meh though IMHO. Gruber asked:

    “So it’s not a scheme to sell new iPhones?”

    to which Joswiak answered:

    “No, not at all. Otherwise, we would have been smart enough just to do our most recent iPads and Macs, too, wouldn’t we?”

    (good answer)

    But heck, lobbing softballs much? I mean, seriously, Gruber, what the heck would you expect their answer to such a question would be? “Ah shucks, you got us. Sorry man. But hey, can’t blame a guy for trying, right?” Not exactly iJustine levels of brown-nosing here, but sure starting to feel a bit close. LOL 😆

  11. Adam’s assessment is very helpful and sober. It tells me what I wanted to know, and he understands why professional writers are critical. I’ve been making my living writing about science and technology for decades, and yes, I’m critical.

    I agree that Siri has been a disappointment and seems to have gotten worse. I was a fairly early adopter of computers, so I learned to search by looking for particular names or words, which a simple word search can do cleanly and easily. That’s what I want from a search, and Siri can’t find things I know are somewhere in my Mac’s memory. I have the same problem using Google and other search engines on the net, and it’s getting worse. With Google, it’s painfully obvious that their searches are slanted to their advertising business. With Siri, I think it wants a sentence rather than a name or quote to search for. I don’t think AI will make this better unless it allows you to request a particular type of search, and I don’t know if that’s possible.

    One thing I would worry about with writing tools is that they are likely to be “one size fits all,” with a limited number of modes of editing. I have written books for kids and I have written textbooks on lasers and fiber optics, and I write very differently for the two audiences. Likewise, a physician would use professional terminology talking with another doctor, and speak or write differently to patients. So Apple Intelligence tools may be useful in dashing off notes to friends and family, but I would not expect them to be suitable for editing a scholarly paper or a big report on your big project for your boss.

    It’s going to be interesting to watch, but the current hype level worries me. Look at the growing skepticism about self-driving cars, which have fallen far behind original projections, and which still keep running into fire trucks with their lights flashing, which is an AI failure.

  12. Yes and no. The chatbots are pretty good at tailoring their output to different styles and reading levels now, but that’s for generated text. I’ve noticed, though, that Grammarly’s editing suggestions tend to stay in the same tone, so some effort or settings may be required to get them to change tone.

    I’m becoming quite put out with both extremes. No, AI won’t be taking your job, and no, Boston Dynamics robot dogs armed with guns won’t be roaming the streets. It doesn’t do anything on its own.

    Simultaneously, every prediction Elon Musk and Ray Kurzweil make seems to be cluelessly optimistic or, in Musk’s case, more aimed at trolling a competitor or the world in general.

    AI is just technology, and like all technologies, it will be used by humans for good and ill.

  13. I’m with you on that.

    I would much rather see clear-eyed impartial analysis than doomsayers or singularity optimists. And we need to get serious about climate change and the impact of the high energy demands of large-language models.

  14. I’ll believe it when I see it. Developing code WELL requires a deep understanding of the problem, and generally, AI is not yet at that point of understanding.

    Maybe AI can “write code” today, but if it’s based on LLMs, that code is probably extremely buggy. Just because code compiles doesn’t mean it works, does anything useful, or is meaningful.

  15. Never forget what an LLM is actually doing. It has no understanding of your problem, nor is it “writing” anything.

    It is (effectively) using a massive database of probabilities in order to generate a sequence of words and phrases that are statistically likely to follow each other.

    So if you ask it to write code (or anything else), you will only get something that was previously written by a human, or a mash-up of stuff written by multiple humans. But there’s no guarantee that the result will be correct, or even coherent.

    So if you ask an LLM to write code, you’d better review that code really carefully before trusting it with anything. And I would argue that the effort needed to do that is likely going to be more than the effort you’d need to write that code yourself (assuming you’re a good programmer, of course).
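    To make that concrete, here is a toy version of that loop in Swift. The word table is made up, and real LLMs work over tokens with billions of learned parameters, but the basic move is the same: pick something statistically likely to come next, with no understanding involved.

        // Tiny, hypothetical "model": each word maps to possible next words
        // and their probabilities.
        let nextWordTable: [String: [(word: String, probability: Double)]] = [
            "write": [("code", 0.6), ("prose", 0.4)],
            "code": [("that", 0.7), ("quickly", 0.3)],
            "that": [("compiles", 0.5), ("works", 0.5)],
        ]

        // Pick a follow-on word by weighted random choice, or nil to stop.
        func nextWord(after word: String) -> String? {
            guard let options = nextWordTable[word] else { return nil }
            var r = Double.random(in: 0..<1)
            for option in options {
                r -= option.probability
                if r < 0 { return option.word }
            }
            return options.last?.word
        }

        var output = ["write"]
        while let word = nextWord(after: output.last!) {
            output.append(word)
        }
        print(output.joined(separator: " "))  // e.g., "write code that compiles"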

  16. In total agreement!!

  17. What question would you have asked that would have gotten a different answer?

  18. This is one of the reasons I never try to get interviews with Apple executives. They’re not going to answer anything they don’t want to, and they’re highly practiced at dodging any such questions. So I’d doubt that anything new would come out of such an interview.

  19. Agreed. Most corporations (including the one I work for) have policies that only people specifically authorized to speak with the press can participate in interviews like these.

    The only way a reporter is going to get “realfacts” and not “goodfacts” is if he can interview someone not in a PR position. We sometimes see reporters able to pull this off, generating impressive exposés (even if their legitimacy may be highly suspect), but I think it’s fair to say that at a company like Apple, someone even giving the appearance of giving such an interview is going to be summarily fired.

    So I would assume that you’re only going to get these kinds of interviews from disgruntled employees who were planning on quitting anyway - which is hardly going to be an unbiased opinion, even if it is the opposite of the official (and therefore also biased) opinions.

  20. That’s been my experience. ChatGPT has been very good at “suggesting” approaches I had not considered, and I have found that helpful on many occasions. However, the actual code it presents often doesn’t even compile, and when it does, I don’t recall it ever actually solving the problem presented as a prompt.

  21. Speaking as a veteran reporter, back in the 1980s and 1990s it was sometimes possible for a reporter to get through to corporate executives who would talk openly and honestly. Now it’s very difficult unless you know them, they trust you, and you can get around the public relations department. These days the only time I can get through to most corporate officials at all is if I am responding to a press release they sent out.

    I suspect this stonewalling contributes to the number of dubious corporate projects that get off the ground.
