
Apple extracts the “essence” of Google Gemini to develop AI models that run locally on iPhone devices

It seems Apple has decided to take the short, smart path in the world of artificial intelligence. Instead of reinventing the wheel (or in this case, re-reading all the data on the internet), it has decided to leverage Google’s long-standing expertise. Recent reports indicate that Apple has gained rare, full access to Google’s “Gemini” model within its own data centers, not merely to use it as is, but to transform it into something that fits Apple’s own philosophy: privacy and offline functionality.

[Image from iPhoneIslam.com: Apple logo and Google logo separated by a vertical line]


The art of “Distillation”: How does Apple create small intelligence from a big giant?

The secret lies in a technique called “distillation.” The idea, simply put, is that Apple asks the massive, capable Gemini model (the “teacher”) to perform a set of complex tasks, then asks it to explain how it reached those results and the reasoning it followed. Apple then takes those answers and reasoning traces and uses them to train its own smaller, cheaper “student” models.
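The data-collection step described above can be sketched in a few lines. This is a minimal, hypothetical illustration: `ask_teacher` is a stand-in function, not a real Gemini API, and the dataset format is invented for clarity.

```python
# Hypothetical sketch of the first stage of distillation: query a large
# "teacher" model for both an answer and its reasoning, and store the
# pairs as training examples for a smaller "student" model.

def ask_teacher(prompt: str) -> dict:
    """Stand-in for a call to a large teacher model (e.g., via an API)."""
    return {
        "answer": f"answer to: {prompt}",
        "rationale": f"step-by-step reasoning for: {prompt}",
    }

def build_distillation_dataset(prompts: list[str]) -> list[dict]:
    """Collect (prompt, answer, rationale) triples to supervise a student."""
    dataset = []
    for prompt in prompts:
        response = ask_teacher(prompt)
        dataset.append({
            "prompt": prompt,
            "target": response["answer"],
            # The reasoning trace itself becomes part of the supervision,
            # so the student learns *how* the teacher thinks, not just what
            # it answers.
            "rationale": response["rationale"],
        })
    return dataset

examples = build_distillation_dataset(["Summarize this email", "Plan a trip"])
print(len(examples))  # 2
```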

[Image: Google Gemini and Apple]

In this way, Apple’s small models learn the “internal calculations” used by Gemini, resulting in highly efficient models whose performance approaches Gemini’s while requiring significantly less processing power. And the big advantage? These models can run directly on your iPhone or iPad without needing to connect to external servers, which means greater speed and absolute privacy.
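The training side of distillation is commonly done with a temperature-softened KL-divergence loss that pushes the student to match the teacher’s full output distribution, not just its top answer. The sketch below shows that standard loss in plain Python; it illustrates the general technique only, and says nothing about Apple’s actual training code.

```python
import math

def softmax(logits, temperature=1.0):
    """Convert raw scores to probabilities; higher temperature = softer targets."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence between softened teacher and student distributions.

    The student is rewarded for matching the teacher's whole probability
    distribution over answers, which carries more information than the
    single "correct" label.
    """
    p = softmax(teacher_logits, temperature)  # teacher's soft targets
    q = softmax(student_logits, temperature)  # student's predictions
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# A student that matches the teacher incurs zero loss; a mismatched one does not.
print(distillation_loss([3.0, 1.0, 0.2], [3.0, 1.0, 0.2]))       # 0.0
print(distillation_loss([3.0, 1.0, 0.2], [0.2, 1.0, 3.0]) > 0)   # True
```

In practice this loss is minimized by gradient descent over many teacher-labeled examples, which is how a small model ends up approximating a much larger one.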


Siri in its new attire with iOS 27

Apple is not stopping at distillation; it is also fine-tuning Gemini to suit its own taste. Gemini is primarily built to be a chatbot or a programming assistant, and that does not always align with what Apple wants for its users. The company is therefore “refining” the model to better match users’ everyday needs, away from programming complexities.

[Image: a blue app icon with the number 27 in the center, overlaid with an abstract colorful ring design]

With the arrival of iOS 27, we expect to see a highly advanced “chat” version of Siri, which will be capable of doing much of what Gemini currently does. Siri will no longer just set alarms; it will be able to summarize complex information, understand and scan uploaded documents, and even write stories and provide emotional support, all the way to practical tasks like booking flights and travel.


Between Google’s collaboration and Apple’s own efforts

Despite this close collaboration, do not think Apple has put all its eggs in Google’s basket. The “Apple Foundation Models” team is still working day and night to develop the company’s own AI models, fully independent of Gemini. Apple knows well that total reliance on another party is a risk it does not like to take.

[Image from iPhoneIslam.com: an abstract white shape intertwined with smooth, multi-colored tubular rings on a dark background]

Historically, Apple did not invent the graphical user interface, the mouse, or even the smartphone, but it “perfected” them and presented them to the world in the best possible form. Now it seems to be repeating the same playbook with artificial intelligence: take the raw technology from giants like Google, then polish it until it is easy, user-friendly, and runs seamlessly inside your pocket.

Do you think Apple’s collaboration with Google is an admission of falling behind or a smart move to save time?

Source:

macrumors.com
