Siri and Multiple Commands: The Problem Isn’t Language Understanding, It’s Execution

Apple dreams of a version of Siri that can handle complex requests like “Find the photo from Saturday, crop the background, and send it to my mom” in a single command, without the need for follow-up questions or manual switching between apps. This capability lies at the heart of the comprehensive Siri overhaul scheduled to be unveiled at the Worldwide Developers Conference (WWDC) on June 8 as part of iOS 27. The ability to issue multiple commands in one sentence is the key feature of this update, but the real problem is that it doesn’t work reliably yet.

From Phonegram: A computer screen showing a search bar with the prompt 'Ask anything', an option titled 'Tools', and hints for the latest Siri iOS 27 update appearing below an unclear title.


The Three-Tier Execution Chain

From Phonegram: A hand holding an iPhone displaying Siri settings after the Siri iOS 27 update, with the Siri icon glowing at the bottom of the screen.

At WWDC 2024, Apple defined the Siri upgrade in three parts: deeper linguistic understanding, contextual awareness linked to personal data, and expanded ability to take actions within and across apps. The first two layers enable Siri to “understand,” while the third layer is the true test of success or failure.

Natural language understanding is Siri’s ability to parse instructions even when the user stumbles over their words, a real shift from the old keyword-driven system. Context awareness, in turn, requires Siri to know what is on the screen and to have access to personal content like photos and messages, so it can resolve a phrase like “the photo from Saturday.”

But execution is the weakest link: Siri must invoke structured actions in the correct sequence across apps it doesn’t always control. This is not a language-model problem but a systems-reliability problem, as beta versions of iOS 26.5 revealed that Siri still misinterprets requests or crashes when performing complex tasks.


The Challenge of Third-Party Apps and the App Intents Framework

From Phonegram: A timeline showing SiriKit, WidgetKit, Custom Intents, and App Intents, connected by arrows indicating progress from left to right, highlighting features introduced with the Siri iOS 27 update.

For every step in a sequential command, Siri needs a structured path within the relevant app. Apple has built this through the “App Intents” framework, which covers hundreds of actions. Within Apple apps like Photos, Messages, and Mail, integration is tight because Apple controls the system from start to finish.

For reference, App Intents is the technology that acts as a “translator” or “bridge” between system intelligence like Siri or the Shortcuts app and the functions within your apps.

However, for third-party apps, the capability depends entirely on developers adopting this framework, and the gap is clear: a request like “Find the photo, crop it, and send it” might work across Apple apps, but a request involving an external app like Dropbox can stumble if the developer hasn’t implemented the required intents. Failure here doesn’t always produce a clear error; execution might simply stop halfway, leaving the user confused.
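For illustration, here is a minimal sketch of what one step in such a chain might look like when an app adopts the App Intents framework. The app, the `CropPhotoIntent` name, and the cropping logic are hypothetical; only the protocol, the `@Parameter` annotation, and the `perform()` shape come from Apple’s framework:

```swift
import AppIntents

// Hypothetical example: an action a photo-editing app could expose
// to Siri and Shortcuts by adopting the App Intents framework.
struct CropPhotoIntent: AppIntent {
    // The name Siri and Shortcuts display for this action.
    static var title: LocalizedStringResource = "Crop Photo"

    // Structured input: Siri can fill this from a previous step,
    // e.g. the photo returned by a search action.
    @Parameter(title: "Photo")
    var photo: IntentFile

    // Declaring a return value lets the next step in a chained
    // command ("…and send it to my mom") consume this one's output.
    func perform() async throws -> some IntentResult & ReturnsValue<IntentFile> {
        let cropped = photo  // placeholder: the app's real cropping code runs here
        return .result(value: cropped)
    }
}
```

The typed inputs and outputs are what make chaining possible at all: Siri can pass the file returned here into a separate send-message intent without the user touching the screen, but only if every app in the chain has declared its actions this way.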


Launch Delays Indicate the Difficulty of the Task

From Phonegram: A man in a modern room with a screen showing the phrase 'New Siri cancelled!' and two app icons, while another man stands with his head bowed, and a bold headline reads 'We admit our failure!' after the Siri iOS 27 update setback.

Apple has officially delayed the “more personalized” version of Siri several times; it was targeted at earlier updates such as iOS 26.4, which shipped without the new Siri features. This pattern of delays points to a structural difficulty: making Siri understand a complex sentence in a test environment is one thing, but making it execute three consecutive tasks accurately on millions of devices with unpredictable inputs is something else entirely.

Reports indicate that the rollout may be spread across iOS 26.5 and iOS 27, suggesting the execution layer is not yet consistent enough to ship as a single, integrated feature. Apple appears to be releasing what it trusts now while continuing to strengthen the parts that are still struggling.


Upcoming Developers Conference

From Phonegram: A person speaking on stage at a tech conference with Siri and WWDC26 logos displayed on a large screen in the background, while the audience takes photos, highlighting the expected Siri update for iOS 27.

Apple will certainly dazzle us with a Siri demo on June 8, and everything will look perfect because the presentation is pre-prepared and tested. But the demo is not what matters; the real test is what Apple commits to in practice: which apps Siri will actually support, and which tasks it can perform accurately from day one. That is the difference between a promotional demo and a real feature we use in our daily lives.

If Siri can turn our normal speech into multi-step actions within apps without us touching the screen, it will be a revolutionary change in how we use the iPhone. The real question now is not whether Apple wants to achieve this, but whether the system has finally become robust enough to execute these complex tasks and fulfill the promises we’ve heard from the company over the past two years.

Do you think Siri will finally become the assistant Apple promised us years ago?

Source:

apple.gadgethacks.com
