Apple plans two major upgrades to its Siri voice assistant next year, starting with new personalization features in iOS 26.4 before fully transforming the assistant into a complete chatbot in iOS 27. Barring any changes to the timeline, we expect to see the new Siri unveiled by June 2026. Here's a comprehensive overview of everything we know so far.

Turning Siri into a chatbot

With the release of iOS 27, Apple will radically change how Siri works. Currently, Siri can answer basic questions and complete simple tasks, but it lacks the ability to engage in two-way conversations, assist with multi-step tasks, or handle complex questions.
Based on current rumors, the revamped Siri will be able to do all that and more, operating in much the same way as competing chatbots.
Apple initially had no plans to offer a full-fledged chatbot like Claude or ChatGPT, but the immense popularity of these bots made it impossible to ignore. Simply adding AI capabilities to apps and features was no longer enough for Apple to remain competitive in a world where people rely on chatbots for everything from web searches to help with coding.
Google has already integrated the Gemini model into a wide range of Android devices, and other bots such as ChatGPT have hundreds of millions of weekly active users.
Siri's new capabilities

According to journalist Mark Gurman, Siri's chat capabilities will be deeply integrated into Apple products at the system level. Rather than being confined to a standalone app, Siri will remain woven into iOS, iPadOS, and macOS, just as it is now.
Users will activate Siri the same way they do now, either by saying a wake phrase such as “Hey Siri” or by pressing the side button on compatible devices, and Siri will respond to both voice and text requests.
We don't yet know what the new Siri interface will look like. Apple will need to make significant changes to the appearance and feel of Siri if it wants to match the functionality offered by companies like OpenAI, Anthropic, and Google.
People are used to opening an app and having a full text interface that includes their conversation history, and it's unclear how Apple will provide that without a dedicated Siri app. Users will want access to their past conversations and tools to upload files and photos.
Invoking Siri could bring up a full-screen interface like the one those chatbot apps present on iPhones, iPads, or Macs, but that would be a departure from Siri's current minimalist design. Alternatively, Apple might record conversations in a place like the Notes app, or in the clipboard on Macs.
Gurman says Siri won't be just an app, but that might mean it won't be solely an app: there could be a dedicated chatbot app for those who want one, while Siri remains system-wide and usable within other applications.
What will the Siri chatbot be able to do?
It appears that Siri will be able to do everything current chatbots do, and more, including:
◉ Searching the web for information.
◉ Generating images.
◉ Creating content.
◉ Summarizing information.
◉ Analyzing uploaded files.
◉ Using personal data to complete tasks, pulling information from emails, messages, files, and more.
◉ Analyzing open windows and on-screen content to take action.
◉ Controlling device features and settings.
◉ Searching for content on the device, replacing the Spotlight feature.
Siri will also be integrated into Apple's core apps, including Mail, Messages, Apple TV, Xcode, and Photos. Siri will be able to search for specific images, edit photos, assist with coding, suggest TV shows and movies, and send emails.
The difference between Siri in iOS 26.4 and the chatbot

In iOS 26.4, Apple plans to introduce a new and updated version of Siri based on Large Language Models (LLMs). Apple has been working on this version since adding Apple Intelligence features to iOS 18, but it was delayed because Siri's underlying infrastructure needed a complete overhaul to run these large language models.
Starting with iOS 26.4, Siri will be able to hold back-and-forth conversations and give more human-like responses, along with new personalization features, but it won't have full chatbot capabilities. Here's what we can expect:
Personal Context
Through personal context, Siri will be able to draw on your emails, text messages, files, photos, and more, learning about you so it can help you complete tasks and recall what was sent to you. Examples include:
◉ Show me the files that Zaid sent me last week.
◉ Find the email in which Zaid mentioned something related to work.
◉ Look for the books that Zaid recommended to me.
◉ Where is the recipe that Umm Zaid sent me?
◉ What is my passport number?
Onscreen Awareness

This feature will allow Siri to see what's on your screen and act on what you're looking at. For example, if someone sends you an address in a text message, you can ask Siri to add it to your contacts. Or, if you're looking at a picture and want to send it to someone, you can ask Siri to do that for you.
Deeper integration with applications
Deeper integration means Siri will be able to do more within and across apps, performing actions and completing tasks that are currently impossible. A complete picture isn't available yet, but Apple has provided some examples (a developer-side sketch follows the list):
◉ Transfer files from one application to another.
◉ Edit a picture and then send it to someone.
◉ Get directions home and share your expected arrival time with Zaid.
◉ Send the email that you have written to Zaid.
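Apple hasn't said how developers will expose these actions to the new Siri, but its existing App Intents framework is the most likely starting point. The Swift sketch below is a minimal, hypothetical example of how an app might declare an "edit a picture and send it" style action today; the intent name, parameter, and dialog text are illustrative assumptions, not confirmed API for the iOS 27 Siri.

import AppIntents

// Hypothetical intent: lets Siri (or Shortcuts) ask this app to send the
// most recently edited photo to a named recipient. Only the AppIntent
// machinery here is real; the action itself is an illustrative stand-in.
struct SendEditedPhotoIntent: AppIntent {
    static var title: LocalizedStringResource = "Send Edited Photo"

    // Siri could fill this parameter from the user's request,
    // e.g. "send the photo I just edited to Zaid".
    @Parameter(title: "Recipient")
    var recipient: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would hand the photo off to its sharing flow here.
        return .result(dialog: "Sending the edited photo to \(recipient).")
    }
}

An intent like this already surfaces in Shortcuts and can be triggered by voice; the iOS 27 rumors suggest Siri's model would be able to chain such actions across apps on its own.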
A chat-like interface for conversations with Siri will not be available with the release of iOS 26.4, but the personal assistant will be significantly different from what it is now. Craig Federighi, Apple's head of software engineering, told employees last summer that the Siri overhaul was a success, saying, "This has put us in a position not only to deliver what we announced, but to deliver a much bigger upgrade than we envisioned."
Siri redesign
With all the new features coming, Apple is also planning changes to Siri's visual design. It's not entirely clear what this will entail, but for the desktop robot it has in development, Apple has been testing an animated version of Siri resembling the Mac's Finder logo.
Apple may begin rolling out this new, more personalized design when Siri gets its major overhaul in iOS 27.
Memory and privacy
Chatbots like Claude, ChatGPT, and Gemini can remember past conversations and interactions, building up a memory of the user. Apple is reportedly debating how far Siri's memory should go, as the company may limit conversational memory to protect user privacy.
Naming
Although Siri will get a complete overhaul, Apple will likely continue to call it Siri; it will simply be a smarter version of the same assistant.
Infrastructure and servers (Google deal)

Apple has struck a deal with Google that will see future versions of Siri powered by Gemini technology. Apple plans to use Gemini for the iOS 26.4 upgrade, and Google's technology will also power the Siri chatbot.
In a joint statement released in January, the two companies said: “Apple and Google have entered into a multi-year collaboration under which the next generation of Apple's foundational models will be based on Google's Gemini models and cloud technologies.”
Siri will rely specifically on a custom AI model developed by Google's Gemini team. Gurman says this custom model is comparable to Gemini 3 and will be significantly more powerful than the model behind the upcoming iOS 26.4 features.
Apple and Google are also discussing running the Siri chatbot on Google servers powered by its tensor processing units (TPUs), possibly because Apple does not yet have the infrastructure to handle chatbot queries from billions of daily active devices.
In the future, Apple may move Siri to a different underlying model, potentially allowing it to drop Google once its in-house language models are strong enough to compete with ChatGPT or Gemini. Apple may also offer chatbot features in China through a partnership with a Chinese AI company, given that China restricts foreign companies from providing AI services within the country.
Supported platforms

Siri's chat functionality will be a key new feature in iOS 27, iPadOS 27, and macOS 27, bringing the new capabilities to iPhones, iPads, and Macs. Similar features may also come to other platforms such as visionOS and tvOS.
Cost
There is no information yet on whether Siri will carry any fees. The chatbot won't be able to run entirely on-device, so Apple will need massive cloud processing power, and development and hosting costs aside, Apple already pays Google nearly a billion dollars annually for access to its models.
Companies like Google and OpenAI spend billions annually on infrastructure and computing costs, and no AI service is completely free. Apple will likely need to charge some fees, but it might follow Google's approach with Gemini.
Google offers a free version of Gemini on Pixel phones and other Android devices with built-in AI. The basic version can answer questions, summarize texts, write emails, and control apps and smartphone features.
Android users can also pay $20 a month for Gemini's advanced tier, which offers better reasoning capabilities, a longer context window for analyzing large documents, and improved coding assistance.
Launch date
Apple plans to introduce Siri's chatbot capabilities when it announces iOS 27, iPadOS 27, and macOS 27 at its Worldwide Developers Conference (WWDC) in June. If the chatbot features aren't ready, Apple will likely postpone the unveiling to avoid repeating the misstep it made with iOS 18 and Apple Intelligence.
The Siri chatbot is expected to roll out in updates in September, after several months of beta testing.