
Apple Intelligence is playing catch-up: here’s what you need to know

Apple's take on AI plans to fundamentally change the way you interact with its products

By Andrew Tarantola | Updated October 31, 2025
Apple's Craig Federighi presents the Image Playground app running on macOS Sequoia at the company's Worldwide Developers Conference (WWDC) in June 2024.
Apple

With so many AI companions out there, it was only a matter of time before Apple built its own. Apple Intelligence is Apple’s take on AI, and it aims to fundamentally change the way users interact with its products by incorporating machine learning and advanced AI capabilities into everyday devices.

While that is the plan, Apple has unfortunately fallen far behind in the AI race. Alternatives like Gemini from Google and ChatGPT are already so far ahead that I am considering ditching my iPhone for a Google Pixel.


Promising more conversational prose from Siri, automated proofreading and text summarization across apps, and lightning-fast image generation, Apple’s AI ecosystem is designed to enhance user experiences and streamline operations across its product lineup. Here’s everything you need to know about Apple’s supposedly transformational AI.

Apple Intelligence release date and compatibility

Apple Intelligence was originally due to release in September 2024 alongside the rollout of iOS 18, iPadOS 18, and macOS Sequoia. However, it was delayed by a month, with a phased release finally beginning at the end of October 2024, starting with U.S. English users. More languages and regions became available throughout 2024 and 2025.

You’re only able to use Apple Intelligence on the following devices:

  • iPhone 17 (including Pro and Pro Max)
  • iPhone Air
  • iPhone 16 (including 16e, Plus, Pro and Pro Max)
  • iPhone 15 Pro and Pro Max
  • iPad Pro (M1 and later)
  • iPad Air (M1 and later)
  • iPad Mini (A17 Pro)
  • MacBook Air (M1 and later)
  • MacBook Pro (M1 and later)
  • iMac (M1 and later)
  • Mac mini (M1 and later)
  • Mac Studio (M1 Max and later)
  • Mac Pro (M2 Ultra)
  • Apple Vision Pro (M2)

Apple Intelligence is still receiving ongoing updates, with some regions yet to get the rollout; those updates will continue through 2025.

Apple Intelligence features

Math Notes feature in iPadOS 18.
Apple

No matter what device you’re using Apple Intelligence with, the AI focuses primarily on three functions: writing assistance, image creation and editing, and enhancing Siri’s cognitive capabilities.

Apple Intelligence is designed to span the breadth of the company’s product line. As such, virtually every feature found in the macOS version of Apple Intelligence is mirrored in the iOS and iPadOS versions. That includes Writing Tools, Image Playground, Memories in Photos, and Siri’s improvements.

In addition, iPadOS, when paired with the Apple Pencil, unlocks more features. Smart Script in the Notes app, for example, straightens and smooths handwritten text in real time. The new Math Notes calculator will automatically solve equations written in the user’s own handwriting and generate interactive graphs based on those equations with a single tap.

We at Digital Trends took an early version of Apple Intelligence for a spin using the macOS Sequoia beta, but came away rather disappointed with what we saw from the digital agent, a sentiment mirrored by many Apple Intelligence users. For one, only a fraction of the AI tools were actually available through the beta release. And the tools we did have access to, including the writing assistant, Siri, and audio transcription, proved buggy and unreliable.

By the time iOS 18.1 was released, Apple had thankfully addressed many of those issues, putting Apple Intelligence on par with more established AI assistants, like Google’s Gemini.

Writing Tools

Apple Intelligence's Writing Tools being used in macOS Sequoia.
Apple

The new Writing Tools feature can proofread the user’s writing and rewrite sections as necessary, as well as summarize text across Apple’s application ecosystem, including Mail, Notes, and Pages. Third-party developers will be able to leverage Writing Tools in their own apps via API calls.
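For developers, adopting Writing Tools in a standard text field is largely a matter of opting in or out. The sketch below is a minimal, hypothetical example assuming the `writingToolsBehavior` property Apple added to `UITextView` in the iOS 18 SDK; exact availability varies by device and region.

```swift
import UIKit

// Hypothetical note editor showing how an app might configure
// Apple Intelligence's Writing Tools on a system text view.
class NoteEditorViewController: UIViewController {
    let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        textView.frame = view.bounds
        view.addSubview(textView)

        if #available(iOS 18.0, *) {
            // .complete permits full inline rewriting; .limited keeps
            // suggestions in an overlay; .none opts this field out.
            textView.writingToolsBehavior = .complete
        }
    }
}
```

System text views get Writing Tools automatically on supported devices; the property mainly matters when an app wants to restrict or disable the feature for a particular field.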

For example, within the Mail app, Apple Intelligence will provide the user with short summaries of the contents of their inbox, rather than showing them the first couple of lines of the email itself (though if you aren’t a fan of that feature, it’s easy to disable). Smart Reply will suggest responses based on the contents of the message and ensure that the reply addresses all of the questions posed in the original email. The app even moves more timely and pertinent correspondence to the top of the inbox via Priority Messages.

The Notes app has received significant improvements as well. With Apple Intelligence, Notes offers audio transcription and summarization features, as well as an integrated calculator, dubbed Math Notes, that solves equations typed into the body of the note.

Image Playground

The Image Playground being used with Apple Intelligence in macOS Sequoia.
Apple

Image creation and editing functions are handled by the new Image Playground app, wherein users can spin up generated pictures within seconds and in one of three artistic styles: Animation, Illustration, and Sketch. Image Playground is a standalone app, although many of its features and functions have been integrated with other Apple apps like Messages.

Apple Intelligence is also improving your device’s camera roll. The Memories function in thePhotos app was already capable of automatically identifying the most significant people, places, and pets in a user’s life, then curating that set of images into a coherent collection set to music. With Apple Intelligence, Memories is getting even better.

The AI can select photos and videos that best match the user’s input prompt (“best friends road trip to LA 2024,” for example), then generate a story line, including chapters based on themes the AI finds in the selected images, and assemble the whole thing into a short film. Photos users also now have access to Clean Up, a tool akin to Google’s Magic Eraser and Samsung’s Object Eraser, as well as improved Search functions.

Siri

Summoning Siri on an iPhone.
Nadeem Sarwar / Digital Trends

Perhaps the biggest beneficiary of Apple Intelligence’s new capabilities is Siri. Apple’s long-suffering digital assistant has been more deeply integrated into the operating system, with more conversational speech and improved natural language processing. You’ll have to manually enable the feature on your iPhone before you can use it, but doing so is a simple task.

Siri can defer to ChatGPT for more complex queries, which we cover in more detail below.

What’s more, Siri’s memory now persists, allowing the agent to remember details from previous conversations, while the user can seamlessly switch between spoken and written prompts. Apple is reportedly working on an even more capable version of Siri, but its release may not come until 2026.

Apple Intelligence privacy

A diagram showing Apple's entire setup for AI computing.
Apple

With other AI competitors suffering constant data leaks that have eroded trust between developers and users, Apple was sure to prioritize privacy when designing Apple Intelligence. This led the company to build its own private and secure AI compute cloud, named Private Cloud Compute (PCC), to handle complex user queries.

Most of Apple Intelligence’s routine operations are handled on-device, using the company’s most recent generations of A17 and M-family processors. “It’s aware of your personal data, without collecting your personal data,” Craig Federighi, Apple’s senior vice president of Software Engineering, stated at WWDC 2024.

“When you make a request, Apple Intelligence analyzes whether it can be processed on-device,” Federighi continued. “If it needs greater computational capacity, it can draw on Private Cloud Compute and send only the data that’s relevant to your task to be processed on Apple silicon servers.” This should drastically reduce the chances of private user data being hacked, intercepted, spied upon, and otherwise snooped while in transit between the device and PCC.

“Your data is never stored or made accessible to Apple,” he explained. “It’s used exclusively to fulfill your request and, just like your iPhone, independent experts can inspect the code that runs on these servers to verify this privacy promise.” The company is so confident in its cloud security that it is offering up to a million dollars to anyone able to actually hack it.

Apple Intelligence will defer to ChatGPT on complex queries

ChatGPT and Siri integration on iPhone.
Nadeem Sarwar / Digital Trends

ChatGPT functionality, including text generation and image analysis, is integrated into Siri and Writing Tools. Furthermore, ChatGPT can step in if Siri’s onboard capabilities aren’t sufficient for the user’s query, though in that case the request is sent to OpenAI’s public compute cloud rather than the PCC.

Users won’t have to leave the Siri screen when utilizing ChatGPT’s capabilities. OpenAI’s chatbot functions in the background when it is called upon, and Siri will state the answer regardless of which AI handles the query. To ensure at least a semblance of privacy protection, the device will display a confirmation prompt before transmitting the request, along with any documents or images the user has attached.


ChatGPT is accessible directly from the device’s user interface (regardless of whether it’s iOS, iPadOS, or macOS), and users will have the option of either logging into their ChatGPT account or using it anonymously.

You’ll also be able to invoke ChatGPT directly simply by telling Siri to have ChatGPT handle the task (e.g., “Siri, have ChatGPT assemble a holiday music playlist”).

Apple Intelligence trained on Google’s Tensor Processing Units

Google's Tensor G2 chip.
Google

A research paper from Apple, published in July, reveals that the company opted to train key components of the Apple Intelligence model using Google’s Tensor Processing Units (TPUs) instead of Nvidia’s highly sought-after GPU-based systems. According to the research team, the TPUs provided the computational power needed to train its enormous LLM, and did so more energy-efficiently than a standalone system could have.

This marks a significant departure from how business is typically done in AI training. Nvidia currently commands an estimated 70% to 95% of the AI chip market, so for Apple to opt instead for the product of Nvidia’s direct rival, and to reveal that fact publicly, is highly unusual, to say the least. It could also be a sign of things to come. Nvidia’s market dominance couldn’t last forever; we’re already seeing today’s hyperscalers making moves into proprietary chip production.

Beyond Google’s ongoing TPU efforts, Amazon has announced that it’s working on its own chip line, one it claims would outperform Nvidia’s current offerings by 50% while consuming half as much power.

Andrew Tarantola
Former Computing Writer
Andrew Tarantola is a journalist with more than a decade reporting on emerging technologies ranging from robotics and machine…