
Apple Intelligence might take a backseat at WWDC25

Apple Intelligence at WWDC25
WWDC25 won’t be as Apple Intelligence-packed as last year.
Image: Apple/D. Griffin Jones/Cult of Mac

Apple doesn’t have as many Apple Intelligence features to announce at WWDC25 as it did during last year’s developer conference. However, a handful of new AI features should arrive, including Apple Intelligence-generated Shortcuts automations, an Apple Intelligence API for developers, and AI-powered health tips. Apple’s foundation language model itself will also be improved, with versions in four different sizes currently in testing.

Here’s what to expect from Apple Intelligence next Monday during the WWDC25 keynote.

Apple Intelligence may not be a big focus at WWDC25

Apple Intelligence was announced by Craig Federighi, SVP of software engineering
Apple Intelligence was a significant part of WWDC24.
Photo: Apple

Apple introduced its suite of AI features under the branding “Apple Intelligence” last year at WWDC24. Apple entered the hotly competitive and highly contentious AI landscape with practical features that the company says anyone can use.

However, after Apple announced Writing Tools, Image Playground, Genmoji, Visual Intelligence and more, the company spent much of the last 12 months implementing those features in iOS 18.1, 18.2, 18.3 and 18.4. And a major project to make Siri smarter remains delayed, leading to a management shakeup in March.

Since Apple’s engineers spent most of the last year finishing and shipping those initial Apple Intelligence features, they didn’t have time to focus on big new AI features for this year’s updates.

That could leave this year’s keynote without a blockbuster segment. While the keynotes at WWDC23 and WWDC24 dedicated nearly 40 minutes of runtime to introducing Vision Pro and Apple Intelligence, respectively, it seems WWDC25 will be more business as usual, with only a handful of new features and other iterative improvements.

So, what Apple Intelligence features can we expect this year?

Apple Intelligence-powered Shortcuts

Settings and Device shortcuts
Shortcuts are a powerful, but complicated, tool.
Screenshot: D. Griffin Jones/Cult of Mac

Apple reportedly plans to update Shortcuts, the app that lets users build automations by visually piecing together actions that can control apps on their devices, with Apple Intelligence. According to Bloomberg’s Mark Gurman, “the new version will let consumers create those actions using Apple Intelligence models.”

Creating a shortcut can be an intimidating process for newcomers, and a tedious one for experienced users. It would be a significant upgrade if you could describe in plain language what you want, and Apple Intelligence were smart enough to create the automation for you.

App Intents, the underlying API that powers Shortcuts, was also going to power a smarter Siri that could take a command and instantly carry it out on your behalf. Apple announced this feature at WWDC24 but delayed it in March 2025 to “the coming year.” That means it could be reannounced Monday.
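Apple hasn’t said how an AI-assisted Shortcuts editor would actually work. Purely as an illustration of the idea, here is a toy Python sketch that maps a plain-language request onto a list of shortcut-style actions. The action names and keyword rules are hypothetical stand-ins; a real system would use a language model and App Intents, not keyword matching.

```python
# Illustrative sketch only: a toy "describe it, get a shortcut" planner.
# Action names and keyword rules are hypothetical, not Apple's actual API.

ACTION_KEYWORDS = {
    "Send Message": ["text", "message", "imessage"],
    "Get Current Location": ["leave", "arrive", "location"],
    "Set Low Power Mode": ["battery", "low power"],
    "Create Reminder": ["remind", "reminder"],
}

def plan_shortcut(request: str) -> list[str]:
    """Return an ordered list of shortcut actions matching the request."""
    words = request.lower()
    return [action for action, keys in ACTION_KEYWORDS.items()
            if any(k in words for k in keys)]

print(plan_shortcut("Text my wife when I leave work"))
# A language model would handle ordering, parameters and ambiguity far better.
```

The point of the sketch is the interface, not the matching: the user supplies one sentence, and the system returns the building blocks a shortcut needs.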

An API for developers

Animation showing Apple Intelligence on iPhone
Unlike most other AI platforms, Apple Intelligence can run on-device.
Image: Apple

In iOS 18, developers can use a limited set of APIs to add Apple’s Writing Tools or Image Playground into their apps, but not much else. The big announcement this year is that developers will be able to use Apple foundation model APIs directly.

Currently, if a developer wants to add AI features to their app, like audio transcriptions or text summaries, they must use a third-party model. That means either paying API fees to another company, like Anthropic or OpenAI, or adding a small open-source model inside the app, which can reduce the quality, bloat the app size and slow down performance.

Giving developers the ability to use Apple Intelligence would bring many benefits. Developers could add AI features without paying exorbitant costs, because Apple Intelligence largely runs on-device. Developers wouldn’t need to integrate a third-party model into their apps, instead using a convenient API provided by Apple.

This would give many more of the third-party apps you use, from Airbnb to Impulse to Paprika to NetNewsWire, the ability to build custom AI features at a much lower cost.

AI-powered health coach

Apple builds tech that acts as 'intelligent guardian' for user health
Apple previously hesitated to give direct health advice in its app.
Image: Apple

Project Mulberry is the internal codename for an AI coach that will analyze the information in the Health app to provide personalized recommendations. Apple is training this AI agent with data from its in-house physicians. According to Bloomberg, “the project remains deep in development” and will be released in spring 2026. As such, Apple might not mention it at WWDC25; if the company does, the feature likely will be qualified as “coming later.”

The AI health coach reportedly will be part of a bigger redesign of Apple’s Health app. The company plans to make the app easier to navigate and understand. It also might add educational videos inside the app on topics like sleep, nutrition, physical therapy and mental health.

Apple’s foundation model is getting better

Apple Intelligence diagram
Apple Intelligence runs on a single foundation model with different Adapters that can power a wide variety of features.
Image: Apple

According to Bloomberg, Apple is testing several new versions of its foundation model: “Versions with 3 billion, 7 billion, 33 billion and 150 billion parameters are now in active use.” The largest model would run via Private Cloud Compute, Apple’s cloud AI servers.
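Bloomberg’s report doesn’t say how those models are stored, but a rough back-of-envelope calculation (assuming the common 16-bit and 4-bit weight formats) shows why the 150-billion-parameter version would have to run on Private Cloud Compute rather than an iPhone:

```python
# Rough memory footprint of model weights alone (excludes KV cache, runtime).
# Bits-per-parameter figures assume standard fp16 and 4-bit quantized formats.

def weight_gb(params_billions: float, bits_per_param: int) -> float:
    """Approximate weight storage in gigabytes (1 GB = 1e9 bytes)."""
    return params_billions * 1e9 * bits_per_param / 8 / 1e9

for size in (3, 7, 33, 150):
    print(f"{size:>3}B params: {weight_gb(size, 16):7.1f} GB fp16, "
          f"{weight_gb(size, 4):6.1f} GB 4-bit")
# The 150B model needs roughly 75 GB even at 4-bit, far beyond an iPhone's RAM.
```

By the same math, the 3-billion-parameter model fits in about 1.5 GB at 4-bit, which is roughly the size class Apple has shipped on-device so far.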

While the latest foundation models apparently keep pace with recent versions of OpenAI’s models, Apple refuses to release a ChatGPT-like conversational chatbot “due to concerns over hallucinations and philosophical differences among company executives,” according to Bloomberg.

However, improvements to the foundation model will improve the quality of every other Apple Intelligence feature. The Writing Tools will produce higher-quality results, notification summaries will become more accurate, and so on.

Siri features remain in development hell

Screenshot of prompts to Siri: “Play the podcast that my wife sent me the other day”; “Delete my reminder to call Aileen”; “Generate an image of a cat playing piano on the moon”; “Add this photo to the email I drafted to Mayuri and Brian”; “Move this to my Important Tasks list”; “Summarize this email”; “Create a new tab group”; “Add this photo to my Birthday Inspiration Freeform boa”; “Delete my Birthday Ideas tab group”
These are the kinds of things you’ll be able to ask the new, smarter Siri.
Image: Apple

Apple continues working on LLM Siri, the ground-up redo of its much-maligned voice assistant. But the new version probably won’t be ready this year, having recently been put under new management. Still, “the hope is to finally give Siri a conversational interface,” according to Bloomberg.

A new feature, dubbed “Knowledge,” is a tool “that can pull in data from the open web,” like ChatGPT. This would likely answer open-ended questions you ask Siri by searching online, processing the top answers and reading you a summary of what it finds.
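Apple hasn’t described how Knowledge would work internally. The Python sketch below only illustrates the general retrieve-then-summarize pattern the report implies, with stubbed-in pages standing in for live web results and naive word overlap standing in for a real ranking model:

```python
# Illustrative only: the retrieve-then-summarize pattern Knowledge reportedly
# follows. Pages are stubs; a real system would search the web and use an LLM.

PAGES = {
    "https://example.com/wwdc": "WWDC is Apple's annual developer conference. "
                                "It is usually held in June.",
    "https://example.com/siri": "Siri is Apple's voice assistant. "
                                "It launched in 2011.",
}

def retrieve(question: str) -> str:
    """Pick the page whose text shares the most words with the question."""
    q = set(question.lower().split())  # naive tokenization, for illustration
    return max(PAGES, key=lambda url: len(q & set(PAGES[url].lower().split())))

def answer(question: str) -> str:
    """'Summarize' the top page by returning its first sentence."""
    return PAGES[retrieve(question)].split(". ")[0] + "."

print(answer("When is WWDC usually held?"))
```

In a production assistant, both steps would be replaced: search-engine retrieval instead of word overlap, and a language model condensing several top results instead of echoing one sentence.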

Robby Walker, who fumbled the ball on Siri before Mike Rockwell took over, is leading the development of Knowledge. Bloomberg says it’s “already been plagued by some of the same problems that delayed the Siri overhaul.”

More miscellaneous AI features

Code completion and Swift Assist in Xcode
Swift Assist will be able to write or refactor your code for you.
Image: Apple

That’s not all the Apple Intelligence features supposedly in the pipeline for WWDC25, either. According to Bloomberg, other minor updates include:

  • Apple reportedly will introduce a new low-power mode, which will use AI to “understand trends and make predictions for when it should lower the power draw of certain applications or features.” This should extend iPhone battery life.
  • The Translate app might get integrated with AirPods for live translation, with a redesign or rebranding likely.
  • Apple continues development of Swift Assist, the programming tool for developers that would be able to write new functions or refactor code. Apple mentioned it last year. This year, “the company is expected to provide an update” on when it will arrive, Bloomberg says. Internally, Apple engineers use a tool powered by Anthropic.
  • Apple also will introduce a new AI-powered tool for user interface testing, a challenging part of software development.

Finally, on a somewhat depressing note, Bloomberg notes that Apple will “quietly rebrand several existing features in apps like Safari and Photos as ‘AI-powered,’” possibly to make it seem like the Apple Intelligence team has been more productive than it actually has.
